2150-2199 Future Timeline

2150

Interstellar exploration is becoming common

By the mid-22nd century, a wide variety of probes to neighbouring star systems have successfully reached their destinations.* The fastest of these can now achieve a significant fraction of light speed, requiring only a few decades of travel time. By way of comparison, space probes of earlier centuries – such as the Voyager missions – would take many thousands of years to reach the stars.
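
To give a sense of scale, the sketch below compares cruise times to Alpha Centauri for Voyager 1 (whose ~17 km/s speed is real) and for a hypothetical probe at 10% of light speed – an assumed figure consistent with the “significant fraction” described above:

```python
LIGHT_YEAR_KM = 9.461e12      # kilometres in one light year
SECONDS_PER_YEAR = 3.156e7

def travel_years(distance_ly: float, speed_km_s: float) -> float:
    """Cruise time in years at constant speed (no acceleration phase)."""
    return distance_ly * LIGHT_YEAR_KM / speed_km_s / SECONDS_PER_YEAR

ALPHA_CENTAURI_LY = 4.37      # nearest star system to the Sun

# Voyager 1's ~17 km/s is its real cruise speed; 10% of light speed
# is an assumed figure for the fast probes described above.
for label, speed_km_s in [("Voyager 1 (~17 km/s)", 17.0),
                          ("Probe at 10% of c", 0.10 * 299_792.458)]:
    print(f"{label}: {travel_years(ALPHA_CENTAURI_LY, speed_km_s):,.0f} years")

# Voyager 1 (~17 km/s): ~77,000 years
# Probe at 10% of c: ~44 years
```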

A number of different engine systems are being utilised – from antimatter, to nuclear pulse propulsion and other more experimental methods. Each of these craft is equipped with powerful AI, heavily automated systems and robots/androids. Protection from incoming debris is offered by cone-shaped force fields projected from the front of each craft. This streamlined shape causes objects like asteroids to drift by without causing any damage.

After journeying for trillions of miles, the majority of probes successfully rendezvous with their destinations. These return a treasure trove of data and visual information on extrasolar planets. In addition, images are taken of Earth and the Solar System viewed from light years away, providing a new perspective of humanity and its place in the universe.

Holodeck-style environments are becoming possible

The concept of virtual reality had been explored as far back as the 1930s, when Stanley G. Weinbaum wrote his short story, Pygmalion’s Spectacles. Published in Wonder Stories – an American science fiction magazine – this described a “mask” with holographic recording of experiences including smell, taste and touch.*

In the 1950s, Morton Heilig wrote of an “Experience Theatre” that could encompass all the senses in an effective manner, thus drawing the viewer into the onscreen activity. He built a prototype mechanical device called the Sensorama in 1962, which displayed five short films while engaging multiple senses (sight, sound, smell and touch). Around this time, engineer and inventor Douglas Engelbart began using computer screens as both input and output devices.

Later in the 20th century, the term “virtual reality” was popularised by Jaron Lanier. A major pioneer of the field, he founded VPL Research in 1985, which developed and built some of the seminal “goggles and gloves” systems of that decade.

A more advanced concept was depicted in TV shows like Star Trek: The Next Generation (1987-1994) and movies like The Matrix (1999). These introduced the idea of simulated realities that were convincing enough to be indistinguishable from the real world.

It was not until the late 2010s that virtual reality became a truly mainstream consumer technology.* By then, exponential advances in computing power had solved many issues hindering previous, cruder attempts at VR – such as cost, weight/bulkiness, pixel resolution and screen latency. It was possible to combine these headsets with circular or hexagonal treadmills, offering users the ability to walk in a seamless open world.* The Internet also enabled participants from around the globe to compete and engage with each other in massively multiplayer online role-playing games.

While clearly a huge improvement over earlier generations of hardware, these devices would pale into insignificance when compared to full immersion virtual reality (FIVR). As computers became ever smaller and more compact, made possible via new materials such as graphene, they were beginning to integrate with the human body in ways hitherto impossible. Their components had shrunk by orders of magnitude, following trends like Moore’s Law. Machines that filled entire rooms in the 1970s had become smartphones by 2010 and the size of blood cells by the 2030s.* This occurred in parallel with accurate models of the brain, establishing a basic roadmap of neurological processes.*

Full immersion virtual reality leveraged these advances to create microscopic devices able to record and mimic patterns of thought, directly inside the brain. Tens of billions of these “nanobots” could be programmed to function simultaneously, like a miniature Internet, the end result being that sensory information was now reproducible through software. In other words – vision, hearing, smell, taste and touch could be temporarily substituted by a computer program, allowing users to experience virtual environments with sufficient detail to match the real world. First demonstrated in laboratory settings and military training environments, FIVR was commercialised in subsequent decades and became one of the 21st century’s defining technologies.

Not everyone was amenable to having nanoscale machines inserted into their brains, however. In any case, full immersion VR provided only a superficial imitation of real life – it could not replicate every subatomic particle, for example, or the countless quantum events occurring at any given moment in time and space. Accounting for these phenomena would require a level of computing on a different scale entirely.


Lattice Quantum Chromodynamics (LQCD) was a promising field in the late 20th and early 21st centuries. By discretising space-time into a four-dimensional grid, it allowed researchers to simulate objects and processes in near-perfect detail, computed directly from fundamental physical laws. By the 2010s, for example, individual proton masses could be determined with error margins close to one percent. During the 2020s, exascale computing helped to further refine the nuclear forces and uncover exotic “new physics” beyond the Standard Model.


Finer and finer resolutions were being applied to greater and greater volumes of space-time, as supercomputers later reached the zettascale, yottascale and beyond. By the 2070s, it was possible to simulate a complete virus with absolute accuracy down to the smallest known quantum level.* Blood cells, bacteria and other living structures followed as this technique approached the macroscale. In the early 22nd century, mind transfer became feasible for mainstream use, as whole-brain scans had now been sufficiently perfected. Another milestone was passed by 2140, with a cubic metre of space-time being accurately simulated.**
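
A back-of-envelope count, sketched below, shows why each of these steps demanded such enormous leaps in computing. The grid spacings and the one-byte-per-site storage figure are illustrative assumptions only:

```python
# Rough scaling only (not a simulation): lattice sites needed for one
# cubic metre at various grid spacings.

def sites_per_cubic_metre(spacing_m: float) -> float:
    return (1.0 / spacing_m) ** 3

for label, spacing_m in [("atomic, 0.1 nm", 1e-10),
                         ("nuclear, 1 fm", 1e-15),
                         ("sub-nuclear, 0.01 fm", 1e-17)]:
    print(f"{label:>20}: {sites_per_cubic_metre(spacing_m):.0e} sites")

# atomic, 0.1 nm      : 1e+30 sites
# nuclear, 1 fm       : 1e+45 sites
# sub-nuclear, 0.01 fm: 1e+51 sites
#
# Even at a single (assumed) byte per site, femtometre resolution
# implies ~1e45 bytes for one cubic metre – some 21 orders of magnitude
# beyond a yottascale (1e24) machine, hence the long wait until 2140.
```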

These four-dimensional lattice grids were, in effect, miniature universes – fully programmable and controllable. When combined with artificial intelligence, matter contained within their boundaries could be used to recreate virtually anything in real time and real resolution. Spatial extents continued to grow, reaching tens of metres. Although highly convincing VR had been around for over a century, achieving this level of detail at these scales had been impossible until now. By 2150, perfect simulations can be generated in room-sized environments without any requirement for on-person hardware.

As virtual reality advances still further, entire worlds are constructed using the smallest quantum units for building blocks. This opens up some profound opportunities in the 23rd century. For example, artificial planet Earths can have their parameters altered slightly – gravity, mass, temperature and so on – then fast-forwarded billions of years to compare the outcomes. Intelligent species evolving on these virtual worlds may be entirely unaware that they are part of a giant simulation.


Hi-tech, automated cities

An observer from the previous century, walking through a newly developed city of 2150, would be struck by the sense of cleanliness and order. The air would smell fresh and pure, as if they were in pre-industrial countryside. Roads and pavements would be immaculate: made of special materials that cleaned themselves, absorbed garbage and could self-repair in the event of damage. Building surfaces, windows and roofs would be completely resistant to dirt, bacteria, weather, graffiti and vandalism. These same coatings would be applied to public transport, cars and other vehicles. Everything would appear brand new, shiny and in perfect condition at all times. Greenery would feature heavily in this city, alongside spectacular fountains, sculptures and other beautification.

Lamp posts, telegraph poles, signs, bollards and other visual “clutter” that once festooned the streets have disappeared. Lighting is achieved more discreetly, using a combination of self-illuminating walls and surfaces, antigravity and other features designed to hide these eyesores, maximising pedestrian space and aesthetics. Electricity is passed wirelessly from building to building. Room temperature superconductors – implanted in the ground – allow the rapid movement of vehicles without the need for tracks, wheels, overhead cables or other such components. Cars and trains simply drift along silently, riding on electromagnetic currents.


Sign posts are obsolete – all information is beamed into a person’s visual cortex. They merely have to “think” of a particular building, street or route to be given information about it.

This observer would also notice their increased personal space, and the relative quiet of areas that, in earlier times, would have bustled with cars, people and movement. In some places, robots tending to manual duties might outnumber humans. This is partly a result of the reduction in the world’s population. However, it is also because citizens of today spend the majority of their time in virtual environments. These offer practically everything a person needs in terms of knowledge, communication and interaction – often at speeds much greater than real time. Limited only by a person’s imagination, they can provide richer and more stimulating experiences than just about anything in the physical world.

On those rare occasions when a person ventures outside, they are likely to spend little time on foot. Almost all services and material needs can be obtained within the home, or practically on their doorstep – whether it be food, medical assistance, or even replacement body parts and physical upgrades. Social gatherings in the real world are infrequent, usually reserved for “special” occasions such as funerals, for novelty value, or the small number of situations where VR is impractical.

Crime is almost non-existent in these hi-tech cities. Surveillance is everywhere: recording every footstep of your journey in perfect detail and identifying who you are, from the moment you enter a public area. Even your internal biological state can be monitored – such as neural activity and pulse – giving clues as to your immediate intentions. Police can be summoned instantly, with robotic officers appearing to ‘grow’ out of the ground through the use of blended claytronics and nanobots, embedded into the buildings and roads. This is so much faster and more efficient that in most cities, having law enforcement drive or fly to a crime area (in physical vehicles) has become obsolete.


Although safe and clean, some of these hi-tech districts might appear rather sterile to an observer from the previous century. They would lack the grit, noise and character which defined cities in past times. One way that urban designers are overcoming this problem is through the use of dynamic surfaces. These create physical environments that are interactive. Certain building façades, for instance, can change their appearance to match the tastes of the observer. This can be achieved via augmented reality (which only the individual is aware of), claytronic surfaces and holographic projections (which everybody can see), or a combination of the two. A bland glass and steel building could suddenly morph into a classical style, with Corinthian columns and marble floors; or it could change to a red brick texture, depending on the mood or situation.


2151

Total solar eclipse in London

A rare total eclipse takes place in Britain this year, with parts of London experiencing totality.* The last time this occurred was in 1715; the next will be in 2600 AD.

Credit: NASA


2160

Mass extinctions are levelling off

A century has passed since the peak in global extinction rates* and biodiversity has now stabilised. With previous food chains having collapsed, the world’s fauna is dominated by the hardiest and most adaptable lifeforms – such as rats, cockroaches and canines – while plant life has seen a marked increase in the proportion of weeds.

Throughout the world lie abandoned cities and decaying infrastructure surrounded by vast wastelands. Small pockets of biodiversity can still be found – but many of these are contained within artificial environments, protected and sealed from conditions outside. Much of humanity has fled to higher and lower latitudes while efforts continue to resolve the climate crisis.


The world’s first bicentenarians

Certain people who were born in the 1960s are still alive and well in today’s world. Life expectancy had been increasing at a rate of 0.2 years per year at the turn of the 21st century. This incremental progress meant that by the time they were 80, these people could expect to live an additional decade on top of their original lifespan.

However, the rate of increase itself had been accelerating, due to major breakthroughs in medicine and healthcare, combined with better education and lifestyle choices. This created a “stepping stone”, allowing people to buy time for the treatments available later in the century – which included being able to halt the aging process altogether.*
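
A toy model captures this stepping-stone dynamic. All of the numbers below are illustrative assumptions, apart from the 0.2 years-per-year figure quoted above: remaining life expectancy falls by one year for each year lived, but medical progress adds some back – and if the rate of progress itself climbs towards one year per year, remaining expectancy stops shrinking altogether:

```python
# A toy "stepping stone" model (assumptions throughout, except the
# 0.2 years-per-year starting rate quoted in the text).

def years_survived(remaining: float, rate: float, accel: float,
                   cap: int = 300) -> int:
    """Years lived before remaining expectancy hits zero (capped)."""
    for year in range(cap):
        if remaining <= 0:
            return year
        remaining += rate - 1.0         # one year spent, some bought back
        rate = min(1.0, rate + accel)   # progress keeps accelerating
    return cap

# Someone aged ~40 in 2005 with ~40 years left, progress at 0.2 yr/yr
# accelerating by an assumed 0.01 yr/yr each year:
print(years_survived(40.0, rate=0.2, accel=0.01))   # -> 300 (hits the cap)

# Once the rate reaches 1.0, each year lived buys a full year back and
# expectancy stops falling – the "stepping stone" to later treatments.
```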


2170

Underground lunar cities are widespread

In the second half of the 21st century, the emergence of low-cost space travel* made it relatively easy to access the Moon and its surface. This led to a surge of new explorers, entrepreneurs and cooperatives looking to make their mark on its undeveloped territories. Now, almost a hundred years after those early settlers arrived, the lunar environment has become a hive of activity.

Lava tunnels – among the most sought-after locations – formed in the ancient past when molten rock flowed underground, either from volcanic activity or from comet and asteroid impacts melting the terrain. These left behind enormous caves, with some of the largest found to be over 100 kilometres in length and several kilometres wide.

Telescopic observations, orbiting probes, and landers in the 20th and 21st centuries revealed more and more information about these regions. Eventually it became possible to send fleets of automated drones and other vehicles below the surface – producing detailed 3D maps with imaging techniques such as LiDAR* and muon tomography. The latter had been used with great success on Earth: to uncover hidden chambers in the Egyptian pyramids, for example; to detect nuclear material in vehicles and cargo containers for the purposes of non-proliferation; and to monitor potential underground sites used for carbon sequestration. Its usage on the Moon allowed deep scanning of tunnels at high spatial resolution.*
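
The principle behind muon tomography can be shown in a few lines. Muons are absorbed as they pass through rock, so the transmitted flux along any line of sight reflects the integrated density of material along it; a hidden cavity shows up as an excess of surviving muons. The attenuation length and path lengths below are illustrative assumptions, not lunar survey values:

```python
import math

# Toy muon tomography sketch.
ATTENUATION_LENGTH_M = 400.0   # rock-equivalent e-folding length (assumed)

def transmitted_fraction(rock_m: float) -> float:
    """Surviving muon fraction through a rock thickness (simple
    exponential model, ignoring the real muon energy spectrum)."""
    return math.exp(-rock_m / ATTENUATION_LENGTH_M)

solid = transmitted_fraction(200.0)          # 200 m of solid rock
cavity = transmitted_fraction(200.0 - 50.0)  # same path with a 50 m void

print(f"solid rock : {solid:.3f}")    # 0.607
print(f"with cavity: {cavity:.3f}")   # 0.687

# The ~13% excess along the second sight line is the cavity's
# signature; scanning many directions builds up a 3D density map.
```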

By 2100, the Moon’s surface and subsurface had been thoroughly mapped, with a large and growing number of commercial operations being established to secure prime real estate.

Lava tube on the Moon. Credit: NASA

The popularity of these underground sites was due in part to their shielding from radiation, micrometeorites and temperature extremes. Sealing of the “skylight” openings with an inflatable module, designed to form a hard outer shell, provided a further layer of protection – and the potential to create a pressurised, breathable atmosphere.

In addition to establishing a safe environment, a number of useful resources could be extracted from these depths. Titanium, for example, appeared in concentrations of 10% or more,* whereas the highest yields on Earth rarely exceeded 3%. The tunnels also contained rare minerals, which had formed as the lava slowly cooled and differentiated. In the polar regions, some tunnels provided easier access to frozen water deposits.*


As the decades passed, the expansion of underground colonies began to accelerate. The initial settlements, containing the bare essentials in terms of food, water, oxygen production and habitat modules, evolved into towns with their own culture and identity.

Structural engineers, having assessed the arch-like roofs overhead, found them to be stable even at widths of several kilometres.* On Earth, lava tubes were unable to form at such enormous sizes, but the Moon’s lower gravity (0.16 g) and lack of weathering or erosion made this possible.
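
A crude scaling argument, sketched below, illustrates part of the reason. Treating a tube roof as a rock beam under its own weight, bending stress grows with gravity and with the square of the span, so the maximum stable span scales as 1/sqrt(g) for the same rock and roof thickness; the 30-metre terrestrial span is an illustrative assumption:

```python
import math

# Crude roof-span scaling sketch (assumptions throughout).
G_EARTH = 9.81   # m/s^2
G_MOON = 1.62    # m/s^2, i.e. ~0.16 g

def lunar_span(earth_span_m: float) -> float:
    """Lunar span with the same roof bending stress as an Earth span."""
    return earth_span_m * math.sqrt(G_EARTH / G_MOON)

# Terrestrial lava tubes rarely exceed a few tens of metres across:
print(f"{lunar_span(30.0):.0f} m")   # -> ~74 m

# Gravity alone gives only a ~2.5x factor; thicker roofs, competent
# basalt, and the absence of weathering and erosion account for the
# much larger kilometre-scale spans described above.
```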


The Moon began to attract more and more residents, seeking a life away from Earth and the chance to form a new society. As a response to the environmental crisis and humanity’s ever-growing footprint, a “deindustrialisation” movement had been gathering pace. This aimed to reduce the burden on Earth’s ecosystems by “offworlding” many traditional production/extraction/manufacturing operations to the lunar environment.

By 2170* – two hundred years since humanity first set foot on the Moon – these lava tunnels are filled with entire cities, home to many millions of people and their robot/AI companions. Much of the infrastructure (including some very large supercomputers) has been imported from Earth, as part of the aforementioned deindustrialisation projects. Most of the original cave entrances, made from inflatable materials, have now been upgraded into fully-fledged airlocks for handling the rapid arrival and departure of large spacecraft.*

In some of the largest and deepest caverns, the lunar conditions allow for a number of gigantic science experiments. For instance, neutrino detectors are being built at scales and efficiencies unmatched on Earth,* taking advantage of the greater isolation from background interference. These are revealing profound insights into astronomical phenomena and the nature of the Universe.

Ongoing expansion of the “datasphere” – defined as the world’s aggregate of generated and stored data – continues to drive the growth of computing and related technologies in the 22nd century. With more and more space required to house data centres and supercomputers, the Moon has become a major focus of activity in terms of fulfilling this need. Formerly separate cave systems are now interconnected, forming hyper-fast networks across a significant portion of the Moon’s surface and subsurface.


2190

Nitrous oxide (N2O) has fallen to pre-industrial levels

Nitrous oxide (N2O) is a naturally occurring gas emitted by bacteria in the soils and oceans, forming part of the Earth’s nitrogen cycle. It was first synthesised by English natural philosopher and chemist Joseph Priestley in 1772. From the Industrial Revolution onwards, human activities began to significantly increase the amount of N2O in the atmosphere. By the early 21st century, about 40% of total emissions were man-made.

By far the largest anthropogenic source (80%) was agriculture and the use of synthetic fertilizers to improve soil, as well as the breakdown of nitrogen in livestock manure and urine. Industrial sources included the production of chemicals such as nylon, internal combustion engines used for transport, and oxidizers in rocketry. Known as “laughing gas” due to its euphoric effects, it was also used in surgery and dentistry for anaesthetics and analgesics.

Nitrous oxide was found to be a powerful greenhouse gas – the third most important after carbon dioxide and methane. While not as abundant in the atmosphere as carbon dioxide (CO2), it had almost 300 times the heat-trapping ability per molecule and caused roughly 10 percent of global warming. After the banning of chlorofluorocarbons (CFCs) in the 1980s, it also became the most important substance in stratospheric ozone depletion.*
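
These figures can be cross-checked with a back-of-envelope calculation, using the ~300x heat-trapping factor quoted above together with assumed, illustrative annual emission tonnages for the period:

```python
# Back-of-envelope CO2-equivalence. The emission tonnages below are
# illustrative assumptions, not measured values.

GWP_N2O = 300                      # warming per tonne, relative to CO2

n2o_emissions_mt = 10              # assumed Mt of N2O per year
co2_emissions_mt = 35_000          # assumed Mt of CO2 per year

co2_equivalent_mt = n2o_emissions_mt * GWP_N2O
share = co2_equivalent_mt / (co2_equivalent_mt + co2_emissions_mt)

print(f"{co2_equivalent_mt:,} Mt CO2-eq, ~{share:.0%} of the combined total")
# -> 3,000 Mt CO2-eq, ~8% of the combined total – in line with the
#    "roughly 10 percent of global warming" quoted above.
```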

By the mid-21st century, the effects of global warming had become very serious.* While most efforts were focussed on mitigating CO2, attempts were made to address the imbalance of other greenhouse gases, including N2O. There was no “silver bullet” for this. Instead, it would take a combination of substantial improvements in agricultural efficiency, reduced emissions in the transportation and industrial sectors, and changes in dietary habits towards lower per capita meat consumption in the developed world. While many of the necessary technologies and innovations were already available in earlier decades, these targets proved difficult to achieve – due to additional costs and the absence of political will for implementation. It was only during the catastrophic events of the second half of the century that sufficient efforts and financial resources were directed towards the problem.

With an atmospheric lifetime of 114 years,* man-made N2O proved difficult to stabilise and remained in the atmosphere well into the 22nd century. By 2190, it has fallen to around 270 parts per billion (ppb), its pre-industrial level.** As well as halting its impact on global warming and ozone damage, other benefits include better overall air quality, reduced loss of biodiversity in eutrophied aquatic and terrestrial ecosystems, and significant economic gains.
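
A minimal one-box decay model, sketched below, illustrates why the gas lingered for so long. It assumes the excess above pre-industrial levels relaxes exponentially with the 114-year lifetime quoted above; the 2060 cut-off year and 335 ppb starting level are assumptions, and the slow tail suggests the full return to ~270 ppb by 2190 would also have required active atmospheric removal or an earlier end to emissions:

```python
import math

# One-box exponential decay of excess N2O (assumed emissions cut-off).
PREINDUSTRIAL_PPB = 270.0
LIFETIME_YEARS = 114.0

def n2o_ppb(year: int, stop_year: int = 2060,
            stop_ppb: float = 335.0) -> float:
    """Concentration assuming man-made emissions ceased in stop_year."""
    excess = stop_ppb - PREINDUSTRIAL_PPB
    return PREINDUSTRIAL_PPB + excess * math.exp(-(year - stop_year)
                                                 / LIFETIME_YEARS)

for year in (2060, 2100, 2150, 2190):
    print(year, f"{n2o_ppb(year):.0f} ppb")
# 2060 335 | 2100 316 | 2150 299 | 2190 291 ppb
```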
