Entity of balance 6

The 6th entity of balance appears in the year 2739, so here is everything that happens until then.

The future:

2024: Open-source, 3D-printed clothes at near-zero cost. 3D printing – having emerged as a mainstream consumer technology – is now so cheap, fast, and easy to use that it can produce items of clothing for just a few cents. A milestone was passed in 2014 when 3D printing became faster than injection molding. The speed of printing continued to increase, doubling every two years in a trend similar to Moore's Law. By 2024, it is over 30 times faster, so an item which took four hours to print in 2014 now takes just seven and a half minutes. Millions of open-source designs are available to download. Sweatshops in the developing world are declining as a result, with low-paid factory jobs made increasingly obsolete.

The Thirty Meter Telescope is fully operational. The Thirty Meter Telescope (TMT) is a huge new observatory built on the summit of Mauna Kea in Hawaii, USA. It is funded through an international collaboration between governments and scientific institutions in Canada, China, India, Japan, and the United States. The TMT operates in the near-ultraviolet to mid-infrared (0.31 to 28 μm wavelengths) part of the spectrum and is designed as a general-purpose observatory for investigating a broad range of astronomical phenomena. The centerpiece of the building is a Ritchey-Chrétien telescope with a 30-meter (98 feet) diameter primary mirror, which is segmented and consists of 492 smaller (1.4 m) hexagonal mirrors. The shape of each segment – as well as its position relative to neighboring segments – can be controlled actively. The mirror is housed in a dome with a diameter of 66 meters (217 feet) and a height of 55 meters (180 feet), comparable to an 18-storey building. Among the existing and planned telescopes of 20 meters or larger, the TMT is located at the highest altitude, sitting 4,050 meters (13,290 feet) above sea level, which provides exceptional clarity of night sky objects.
Even greater sharpness is achieved by its adaptive optics system, which helps correct image blur caused by the Earth's atmosphere. Extremely high-contrast exoplanet imaging is therefore possible. It can detect Earth-like planets around distant stars and take spectroscopy of those worlds to analyze the potential for life in greater detail than ever before. The TMT's other capabilities include revealing the structure of hidden dark matter, which is believed to account for 27% of the total mass-energy content of the known universe. The nature of "first-light" objects can also be determined by peering far back into the young universe. The early formation and evolution of the large-scale structures that dominate the present-day universe can also be observed. In addition, supermassive black holes can be analyzed at very high resolution. This allows scientists to measure general relativistic effects and to spatially resolve the accretion disks of active black holes in the centers of galaxies out to the distance of the Virgo cluster, around 55 million light years away. The TMT mirror has a collecting area nine times greater than the neighboring Keck Telescope, with a spatial resolution over 12 times sharper than the Hubble Space Telescope.

Indonesia gets a new capital city. In 2024, a new capital city is established for Indonesia – its fourth in the last 80 years. Following World War II, Jakarta had become the de facto capital of the Republic of Indonesia. During occupation by the Netherlands Indies Civil Administration (NICA), the capital was moved to Yogyakarta in 1946 and then later Bukittinggi in 1948. It reverted to Yogyakarta in 1949. Once independence was secured, Jakarta again became the capital in 1950, a position it would hold for the next 74 years. In 2019, President Joko Widodo officially announced a relocation of Indonesia's capital to East Kalimantan on the island of Borneo.
The as-yet-unnamed city – costing tens of billions of dollars to plan and build – was needed due to major subsidence in Jakarta. Due to the combined effects of its location on swampy land and over-extraction of groundwater, Jakarta had been sinking at an incredible rate: up to 25 cm (10 in) each year in some of the worst affected parts, and almost half of the city now sat below sea level. Jakarta also suffered from dire pollution, overcrowding, and traffic problems. The new capital, on a plot of 40,000 hectares (400 sq km), would be surrounded by greenery in a sparsely populated area, yet at the same time positioned in a more central and geographically unifying part of the country. One fifth of the approximately $34 billion needed would be financed by the state, with international investors contributing the remainder. The government pledged $40 billion to modernize and prevent the sinking of Jakarta, insisting that it was "not abandoning" the city. Without this action, 95% of Jakarta faced inundation by 2050. Indonesia's Ministry of Public Works and Public Housing organized a design contest, with Nagara Rimba Nusa ('Forest Archipelago') by URBAN+ announced as the winner in December 2019. The masterplan featured strong environmental credentials and the latest in "smart city" technology. It would be developed in phases, with construction starting in 2021, official inauguration taking place in 2024, and the last remaining government departments in Jakarta being transferred across by the end of the decade.

Total solar eclipse across North America. A total solar eclipse occurs on 8th April 2024, visible across North America. For observers on the continent, totality begins at the Pacific coast, moving in a northeasterly direction through Mexico, the USA, and Canada, before ending in the Atlantic Ocean. Its longest duration is four minutes and 28 seconds, near the small town of Nazas, Durango, Mexico, and the nearby city of Torreón, Coahuila.
This is the first total solar eclipse to be visible from Canada since 1979, the first in Mexico since 1991, and the first in the USA since 2017. It is the only total solar eclipse in the 21st century with totality visible in all three countries. The path of this eclipse crosses the path of the prior total solar eclipse of 21st August 2017, with the intersection of the two paths in southern Illinois, in Makanda, just south of Carbondale. The cities of Benton, Carbondale, Chester, Harrisburg, Marion, and Metropolis in Illinois; Cape Girardeau, Farmington, and Perryville in Missouri; as well as Paducah, Kentucky, lie within a 9,000 square mile intersection of the paths of totality of both the 2017 and 2024 eclipses – therefore earning the rare distinction of witnessing two total solar eclipses within a span of seven years.

Euro 2024 is hosted by Germany. Euro 2024 is the 17th edition of the European Championship, a quadrennial tournament organized by UEFA. It is hosted by Germany – the first time it has performed this role as a unified country (West Germany had hosted the tournament in 1988). 51 matches are played over a period of 32 days in June and July, with 24 teams competing in the tournament. During the bid process, only two countries had put themselves forward as candidates: alongside Germany was Turkey. However, the latter's poor record on human rights, its limited hotel capacity in many cities, and the scale of infrastructure required were all of concern. Germany was announced as the winner by 12 votes to 4, with one abstention, on 27th September 2018. As hosts, the German team qualifies automatically. This provides a boost to German football after a disastrous 2018 World Cup, in which the country failed to reach the last 16. A plethora of stadia were available to satisfy UEFA's minimum capacity requirement of 40,000 seats.
A total of 18 cities and stadia had submitted proposals, including the 12 hosts of the 2006 FIFA World Cup, but only ten cities and stadia are used for the competition, the largest being the 75,000-capacity Olympiastadion in Berlin.

Paris hosts the Summer Olympic Games. From 26th July to 11th August 2024, the French capital hosts the 33rd Summer Olympics. Paris becomes the second city after London (1908, 1948 and 2012) to host the Olympic Games on three occasions. 2024 also marks the centennial of the 1924 Summer Olympics, which were held in the same city and were the last Games Paris hosted. Bidding to host the Games started in 2015 with five candidate cities, but Hamburg, Rome and Budapest withdrew, leaving Paris and Los Angeles as the only remaining candidates. A proposal to elect the 2024 and 2028 Olympic host cities at the same time was approved by an Extraordinary IOC Session in July 2017, in Switzerland. The IOC made a deal with Los Angeles to host the 2028 Games, which made Paris the host of the 2024 Games. The formal announcement of the hosts for both Olympiads took place at the 131st IOC Session in Lima, Peru, on 13th September 2017.

Lunar Mission One drills into the Moon's south pole. Lunar Mission One is a British-led, unmanned Moon probe launched in 2024. It attempts to land at the lunar south pole – a region largely unexplored until now – before drilling down at least 20 m (65 ft) and trying to reach as deep as 100 m (328 ft). This provides fresh insights into the Moon's composition and geologic history, revealing new clues about the early Solar System. The mission gains crowdfunding through Kickstarter. Backers are able to contribute photos, text and even their DNA in a time capsule, leaving a digital record of civilization. Detailed analysis of the surface environment helps to gauge the suitability of the lunar south pole as a location for a permanent human base in future decades.

Bioelectronics for treating arthritis are in common use.
Arthritis is a form of joint disorder caused by trauma, infection of a joint, or old age. As of the 2010s, it was the single most common type of disability in the United States, predominantly affecting the elderly and leaving over 20 million individuals with severe limitations in function on a daily basis. Total costs of arthritis cases were close to $100 billion annually, a figure expected to increase dramatically in the future with an aging population. Treatments for arthritis usually involved a combination of medication, exercise, and lifestyle modification, but a cure remained elusive. In 2014, a breakthrough involving the use of bioelectronics was unveiled by researchers. This took the form of a pacemaker-style device embedded in the necks of patients, firing bursts of electrical impulses to stimulate the vagus nerve – a crucial link between the brain and major organs. The impulses were shown to reduce activity in the spleen, in turn producing fewer of the chemicals and immune cells that would normally cause inflammation in the joints of patients with rheumatoid arthritis. Over half of patients saw a dramatic improvement, even for severe symptoms, with up to 30% achieving remission. After successful clinical trials, another decade of progress led to next-generation implants miniaturized to the size of rice grains, as well as improvements in cost and efficacy. By 2024, it is a routine form of treatment in many countries. Bioelectronics are showing promise in other areas too. For example, they can prevent the airway spasms of asthma, control appetite in obesity, and help restore normal insulin production in diabetes.

Carsharing has exploded in popularity. Carsharing is a model of car rental where people rent cars for short periods of time, often by the hour. It is attractive to customers who make only occasional use of a vehicle, as well as others who need access to a vehicle of a different type than they use day-to-day.
While some firms had experimented with the concept in the late 20th century, it only became well established in the early 21st. From the 2000s onwards, a growing trend of flexible, multi-modal, on-demand mobility led to rapid expansion of carsharing services. By 2015, carshare programs were available on five continents, in over 30 countries, and in hundreds of cities worldwide. Rising urbanization, increasing problems of congestion and pollution, and the social and personal costs of private car ownership continued to drive demand for alternatives such as carsharing. New innovations included one-way carsharing services for shorter, spur-of-the-moment trips; automakers partnering with garage chains to give users free parking in city centers; ride-hailing mobile apps; the adoption of plug-in electric vehicles; and a small but growing number of self-driving vehicles. While the industry continued to expand in Europe and North America, most of the new growth was occurring in the Asia-Pacific region, particularly China. In 2014, membership of carsharing programs stood at 2.4 million. By 2024, this has increased nearly ten-fold to reach 23.4 million, while global revenue has risen six-fold, from $1.1 billion to $6.5 billion.

Wind turbine drone inspection is a multi-billion-dollar industry. As the world shifts towards clean energy, the number of wind turbines is growing exponentially. With so many installations, there is now enormous demand for inspection and maintenance of these structures. This is occurring alongside rapid uptake of drones and other unmanned aerial vehicles (UAVs), which can provide a faster and cheaper alternative to traditional inspections. Until now, most of these jobs involved either simple ground-based visual assessments, or complicated and risky rope or platform access (sometimes at heights of 600 feet).
By contrast, drones are essentially risk-free, extremely quick in their operations and offer much higher resolution than human eyes, while automating much of the image processing, data analysis and other tasks. By 2024, global revenue for wind turbine UAV sales and inspection services has reached almost $6 billion.

Starlink reaches full capacity. Starlink is a new worldwide satellite broadband network created by SpaceX. It consists of more than 4,400 cross-linked satellites in the "smallsat" class (weighing just a hundred or so kilograms each), orbiting at an altitude of 1,100 km (680 mi). This is quadruple the number of active satellites that were operational a decade earlier, made possible because they are mass-produced at much lower cost per unit of capability than existing satellites. The SpaceX CEO, Elon Musk, initiated the project after noting a significant unmet demand for low-cost global broadband capabilities. Smaller satellites were seen as crucial to lowering the cost of space-based Internet and communications. The Starlink network provides gigabit speeds at latencies of around 25 ms, about as low as cable Internet service. Normally this would not be possible with satellites, but Starlink uses low-Earth orbits. Flight testing began in 2017 with two prototype satellites – MicroSat-1a and MicroSat-1b. Following these experiments, in which the prototypes communicated with ground stations, the network of 4,400 satellites began launching in 2019. This network reaches full capacity by 2024. Along with projects from rival companies such as OneWeb and Google, the Starlink network helps to further expand the reach of the Internet, as well as increasing the average user's connection speed. Some years later, an even larger network of 7,500 satellites at lower altitude is developed by SpaceX to boost capacity and reduce latency in densely populated areas. Further into the future, a similar network is placed in orbit around Mars.
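A rough sanity check of the latency figure above: at an altitude of 1,100 km, the radio round trip alone takes only a few milliseconds, which is why low-Earth orbits can approach cable-like latency while geostationary satellites cannot. A minimal sketch, using only the standard speed-of-light formula (not SpaceX figures):

```python
# Speed-of-light propagation delay for a satellite directly overhead.
# Real-world latency adds ground routing and processing, so these are lower bounds.
C = 299_792_458  # speed of light in vacuum, m/s

def round_trip_ms(altitude_km: float) -> float:
    """Minimum user -> satellite -> user delay, in milliseconds."""
    return 2 * altitude_km * 1000 / C * 1000

leo = round_trip_ms(1_100)    # Starlink-class low-Earth orbit: ~7.3 ms
geo = round_trip_ms(35_786)   # geostationary orbit: ~239 ms
print(f"LEO: {leo:.1f} ms, GEO: {geo:.1f} ms")
```

The roughly 7 ms physical floor for an 1,100 km orbit is consistent with the ~25 ms end-to-end figure quoted above once routing is included, whereas a geostationary link starts at nearly a quarter of a second before any routing at all.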
The first probe to fly into the Sun's outer atmosphere. Parker Solar Probe (formerly known as Solar Probe Plus) is a historic mission flying into the Sun's outer atmosphere (corona) for the first time. The probe travels to within 5.9 million km (3.6 million miles) of the Sun's surface – just four times the length of its diameter. At such close range, an extremely strong shield is needed at the front of the spacecraft. This is made of reinforced carbon-carbon composite, able to withstand temperatures of 2,000°C. At closest approach, the Parker Solar Probe hurtles around the Sun at approximately 450,000 miles per hour – fast enough to get from Philadelphia to Washington in one second. The mission's primary scientific goals are to determine the structure and dynamics of the magnetic fields at the sources of solar wind; to trace the flow of energy that heats the corona and accelerates the solar wind; to determine what mechanisms accelerate and transport energetic particles; and to explore dusty plasma near the Sun and its influence on solar wind and energetic particle formation. Coming closer to the Sun than any previous craft, the Parker Solar Probe uses a combination of in situ measurements and 3D imaging to revolutionize our knowledge of the physics, origin, and evolution of the solar wind. Its closest approach is in December 2024.

COVID-19 vaccines are available to 90% of the world. COVID-19, caused by the SARS-CoV-2 virus, first came to the world's attention in December 2019 when an outbreak occurred in Wuhan, China. It soon spread to countries around the world, with the WHO declaring a pandemic in March 2020. A highly contagious and airborne disease, with a mortality rate much greater than regular influenza, it became the most serious health crisis to face the world since the Spanish flu of 1918 – causing severe social disruption, mass cancellations and postponements of events, worldwide lockdowns, and the largest economic recession since the Great Depression.
Previous work to develop vaccines against related diseases, SARS (2002) and MERS (2012), led to knowledge about the structure and function of coronaviruses – which accelerated development during early 2020 of varied technology platforms for a COVID-19 vaccine. Russian President Vladimir Putin announced that he had approved the world's first COVID-19 vaccine, named Sputnik-V, in August 2020. However, experts raised concerns about the speed of Russia's work, suggesting that researchers might be cutting corners. The WHO urged Russia to follow international guidelines and did not include the purported vaccine on its list of candidates for phase III clinical trials, which involve more widespread testing in humans. At least 321 candidate vaccines had officially begun development by October 2020, a 2.5-fold increase since April. Of these, 33 were in phase I–II trials and nine in phase II–III trials. North America led these efforts, being home to about 40% of the world's COVID-19 vaccine research projects, compared with 30% in Asia and Australia, 26% in Europe, and a few projects in South America and Africa. Amid growing concerns over "vaccine nationalism", a coalition of 165 countries agreed a landmark deal – known as COVAX – to enable the rapid and equitable distribution of any new coronavirus vaccines. This would ensure that each participating country would receive a guaranteed share of doses to vaccinate the most vulnerable 20% of its people by the end of 2021. The United States rejected the COVAX agreement, choosing instead to focus on its own public–private partnership known as "Operation Warp Speed" with nearly $10 billion allocated by Congress to develop, manufacture, and distribute hundreds of millions of vaccine doses by the end of 2020. As with Russia's Sputnik-V, many medical professionals viewed the timeline for Operation Warp Speed as unrealistic. 
Nevertheless, a number of promising breakthroughs in research and development occurred in late 2020 and early 2021, with several candidates completing late-stage testing. China, meanwhile – having initially been absent from COVAX – joined this global initiative in October 2020. This made it the single largest economy to participate in the alliance and helped to improve the country's image, given that it had been the epicenter of the initial outbreak in 2019, and came under heavy criticism for delays in its early response to the virus. By late 2020, China claimed to have several vaccines in advanced stages of R&D. In India, the Serum Institute – the world's largest vaccine producer by volume – partnered with organizations including the Bill & Melinda Gates Foundation and pledged to deliver 200 million doses for low and middle-income countries at a cost of $3 per dose. Europe had been among the regions hardest hit by the pandemic in terms of deaths per million people, with governments taking major steps to prepare for the introduction of vaccines. The UK, for example, signed deals with pharmaceutical companies to secure 340 million potential doses. Worldwide, the initial roll out of COVID-19 vaccines would generally be limited to healthcare workers and the most vulnerable in society (the elderly, immunocompromised and so on), with a focus on cities or regions experiencing the worst infection rates. Potential "super-spreaders" such as public transport workers and retail/supermarket employees would also be given a high priority as distribution continued. Progress with manufacturing and distribution of vaccines led to a substantial increase in availability from the second half of 2021 onwards. This continued alongside general improvements in the ability to track and contain outbreaks. 
However, producing enough doses to cover the entirety of the human race would take years rather than months, with bottlenecks in production and many candidate vaccines remaining at the trial stage. Billions of people were vaccinated in 2022 and 2023, but the process required more time to be safely and efficiently implemented. By 2024, COVID-19 vaccines have finally been scaled up sufficiently that the vast majority (90%) of the world's 8 billion people have access to a vaccine. The crisis has not ended, however. A significant number of infections continue to be reported daily, while many so-called recovered patients are left with a variety of lingering health issues. Large numbers of people remain reluctant to be vaccinated – either due to basic skepticism about the need for vaccination, or a belief in conspiracy theories. Although more manageable now, COVID-19 remains present in the global population for decades to come.
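The scale of the production challenge described above is easy to underestimate. A back-of-the-envelope sketch (the two-dose regimen and the one-billion-doses-per-year rate are illustrative assumptions, not figures from the text):

```python
# Rough production arithmetic for global vaccine coverage.
world_population = 8_000_000_000
coverage = 0.90          # share of the population with access, per the text
doses_per_person = 2     # assumption: a typical two-dose regimen

people_covered = int(world_population * coverage)
doses_needed = people_covered * doses_per_person
print(f"{people_covered:,} people -> {doses_needed:,} doses")

# Assuming a single production line of 1 billion doses per year, 14.4 billion
# doses would take over 14 years - which is why scale-up took several years
# even with many manufacturers working in parallel.
```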

2025: The ITER experimental fusion reactor is switched on. ITER (originally the International Thermonuclear Experimental Reactor) is a prototype fusion power plant constructed in France. The most complex machine ever created by humans, it aims to become the first project of its kind to demonstrate efficient and economic use of fusion reactions – the same processes occurring naturally within the Sun. While human-made fusion had been achieved briefly and on small scales in the past, these tests resulted in a net loss of energy. By contrast, ITER is designed to produce a plasma that releases the equivalent of 500 megawatts (MW) of power, during much longer pulses than any previous machine. Built at a cost of €22 billion (US$26.1 billion), over a period of nearly two decades, ITER is among the biggest science projects ever undertaken, second only to the International Space Station. This joint research experiment is funded by countries including China, the European Union (EU) members, India, Japan, Russia, South Korea, and the United States. To demonstrate net fusion power on a large scale, the reactor core is required to simulate conditions at the center of the Sun. For this, it uses a magnetic confinement device known as a tokamak. This doughnut-shaped vacuum chamber generates a powerful magnetic field that prevents heat from touching the reactor's walls. Tiny quantities of fuel are injected into and trapped within the chamber. Here they are heated to 100 million degrees, forming a plasma. At such high temperatures, the nuclei of the heavy hydrogen isotopes deuterium and tritium become fused together, forming helium. This releases neutrons and a huge amount of energy. The ITER project began in 1988, with many years of conceptual and engineering studies prior to site preparation in 2007. Tokamak complex excavation commenced in 2010 and construction of the tokamak itself followed later in the decade.
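The net-gain goal described above is conventionally expressed as the fusion gain factor Q: the ratio of fusion power produced to external heating power supplied. A minimal sketch, using the ITER and JET power figures quoted in this section:

```python
# Fusion gain: Q = fusion power out / external heating power in.
# Q > 1 means net energy gain; Q < 1 means a net loss.
def q_value(power_out_mw: float, power_in_mw: float) -> float:
    return power_out_mw / power_in_mw

print(q_value(500, 50))   # ITER design target: 50 MW in, 500 MW out -> Q = 10
print(q_value(16, 24))    # JET, 1997: 24 MW in, 16 MW out -> Q ~= 0.67 (net loss)
```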
The assembly and integration phase got underway in 2018 with the finishing of the concrete supports and bottom parts of the cryostat. The official announcement of "machine assembly" marked a key milestone in 2020, and this led to installation of the vacuum vessel (16 times as heavy as any previous fusion vessel) and central solenoid (the superconducting coil for producing 13.5 teslas). When completed in 2025, ITER contains more than 10 million individual parts, weighing more than 25,000 tons and connected by 200 km (124 miles) of superconducting cables, all kept at –269°C (–452°F) by the world's largest cryogenic plant. The end of assembly is followed by operational activation and first plasma in 2025. The initial experiments pave the way to full deuterium–tritium fusion beginning in the mid-2030s. ITER achieves a Q-value of 10. In other words, an input of 50 MW results in an output of 500 MW – a substantial net gain in the production of energy. ITER can sustain this in bursts of nearly 20 minutes. For comparison, the Joint European Torus (JET) in 1997 – the previous world record for peak fusion power – required 24 MW to produce an output of only 16 MW (a net loss and a Q-value of 0.67), which lasted just a few seconds. The insights resulting from experiments at ITER lead to new ways of holding plasma in place at critical densities and temperatures. Further development and refinement of chamber designs, such as better superconducting magnets and advances in vacuum systems, improves the generation of power required for sustained commercial operations. This is demonstrated by ITER's successor, in the mid-21st century, which takes an input of 80 MW and produces an output of 2,000 MW (a Q-value of 25). In the second half of the 21st century, electricity produced by fusion becomes widely available – offering humanity a new and virtually unlimited supply of clean, green energy.

A billion human genomes have been sequenced.
DNA testing is now so cheap, fast, and routinely accessible that over a billion human genomes have been sequenced around the world. Back in 1990, when the first attempt was made to identify and map all 3.3 billion base pairs in a person – an effort known as the Human Genome Project – the cost of doing so ran into billions of dollars. The time required was over a decade, and the effort involved scientists from all over the globe in what became the largest ever collaboration on a biological project. In the years following the completion of the Human Genome Project, tremendous improvements were made in sequencing times and costs. These new techniques allowed many more individuals to have their DNA read. The cost per genome fell by orders of magnitude – from $100 million in 2001, to under a million dollars by 2008, less than $10,000 by 2011 and just $1,000 by 2016. This was a trend even faster than Moore's Law. DNA sequencing began to enter the mainstream in the second half of the 2010s. In the United Kingdom, for example, the National Health Service (NHS) offered its first medical diagnoses via genetic testing in 2015 and three years later had completed the 100,000 Genomes Project. Similar initiatives were attempted in many other regions, as the benefits of large-scale health databases became clear. The increasing portability and availability of consumer testing kits, such as those offered by 23andMe, led to a further acceleration of this trend. While early consumer kits were restricted to partial scans, it was now technically and financially viable to conduct whole genome sequencing, providing a full and complete analysis of an individual's DNA. As well as future health risks and personalized treatments, information could also be gleaned about ancestry and family history. By 2025, a billion human genomes have been sequenced – about one-eighth of the world's population.
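The claim that sequencing costs fell faster than Moore's Law can be checked directly from the figures above. A small sketch (the implied halving time is my own calculation from those quoted numbers):

```python
import math

# Cost per genome at two points quoted in the text.
cost_2001 = 100_000_000   # ~$100 million
cost_2016 = 1_000         # ~$1,000

drop_factor = cost_2001 / cost_2016    # 100,000x cheaper
halvings = math.log2(drop_factor)      # ~16.6 cost halvings
years = 2016 - 2001

print(f"cost halved every {years / halvings:.2f} years")
# Moore's Law corresponds to a halving roughly every 2 years;
# sequencing managed it roughly every 0.9 years over this period.
```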
The quantity of genomic data is now reaching the exabyte scale, larger than the video content of the entire YouTube website. This has created huge demand for improved storage capacities and led to a surge in cloud computing networks. The sheer volume and complexity of Big Data has made AI programs such as IBM's Watson far more commonly used for medical and research purposes. Among the latest discoveries are thousands of genes for intelligence, providing new insights and targets for the treatment of impaired cognitive abilities. With around 75% of a person's IQ attributed to genetic differences, these genes will play a role in creating super-intelligent humans in the more distant future. While great progress is now being made in genetics, there are privacy and security implications of so much health information being generated and stored online. Various hacking scandals involving the theft and sale of personal data have made the news headlines recently. Insurance firms and others with vested interests, particularly in the U.S., are keen to exploit the treasure trove of medical information now available and have stepped up their lobbying efforts. There is growing concern about the injustice of genetic prejudice and discrimination.

Human brain simulations are becoming possible. The first complete simulation of a single neuron was perfected in 2005. This was followed by a neocortical column with 10,000 neurons in 2008, then a cortical mesocircuit with 1,000,000 neurons in 2011. Mouse brain simulations, containing tens of millions of neurons, were later achieved. By 2025, the exponential growth of data has made it possible to form accurate models of every part of the human brain and its 100 billion neurons. Between 2000 and 2025, there was a millionfold increase in computational power, together with vastly improved scanning resolution and bandwidth.
Much like the Human Genome Project, there were many in the scientific community who doubted that the brain could be mapped so quickly. Once again, they failed to account for the exponential (rather than linear) growth of information technology. Although it is now possible to scan and map a complete human brain down to the neuron level, analyzing the enormous volumes of data it contains and using that to fully understand its workings will take much longer. Nonetheless, this represents a major milestone in neurology and leads to increased funding for research into various brain-related ailments.

3D-printed human organs. Additive manufacturing, also known as 3D printing, was first developed in the mid-1980s. Initially used for industrial applications such as rapid prototyping, it fell dramatically in cost during the 2010s and 2020s, becoming available to a much wider audience. Arguably the most transformative breakthroughs were occurring in health and medicine. Customized, 3D-printed body parts were saving people's lives and included artificial jaw bones, bioresorbable splints for breathing and replacement skull parts, among many other uses. Non-critical applications included dental implants and exoskeletons to assist with mobility and joint movement. Even greater advances were taking place, however. 3D printing was no longer limited to inorganic materials like polymers or metals. It was being adapted to construct living, biological systems. Layer after layer of cells, dispensed from printer heads, could be placed exactly where needed with precision down to micrometer scales. Initially demonstrated for simple components like blood vessels and tissues, more sophisticated versions later emerged in combination with scaffolds to hold larger structures in place. Eventually, the first complete organs were developed, with sufficient nutrients, oxygen, and growth factors to survive as fully functioning replacements in mouse models.
By 2025 – after testing on animals – customized 3D printing of major human organs is becoming feasible for the first time. Although yet to be fully perfected (as certain types of organs remain too complex), this is nevertheless a major boost for life extension efforts. In the coming decades, more and more of the 78 organs in the human body will become printable.

Vertical farms are common in cities. With a total population fast approaching 8 billion, world food demand has continued to climb. At the same time, however, the increasingly dire effects of climate change, as well as other environmental factors, are now having a serious impact. Droughts, desertification, and the growing unpredictability of rainfall are reducing crop yields in many countries, while shrinking fossil fuel reserves are making large-scale commercial farming ever more costly. Decades of heavy pesticide use and excess irrigation have also played a role. The United States, for example, has been losing almost 3 tons of topsoil per acre, per year. This is between 10 and 40 times the rate at which it can be naturally replenished – a trend that, if allowed to continue, would mean all topsoil disappearing by 2070. As this predicament worsens and food prices soar, the world is now approaching a genuine, major crisis. Amid the deepening sense of urgency and panic, a number of potential solutions have emerged. One such innovation has been the appearance of vertical farms. These condense the enormous resources and land area required for traditional farming into a single vertical structure, with crops stacked on top of each other like the floors of a building. Singapore opened the world's first commercial vertical farm in 2012. By the mid-2020s, they have become widespread, with most major urban areas using them in one form or another. Vertical farms offer a number of advantages.
An urban site of just 1.32 hectares, for example, can produce the same food quantity as 420 hectares (1,052 acres) of conventional farming, feeding tens of thousands of people. Roughly 150 of these buildings, each 30 stories tall, could potentially give the entire population of New York City a sustainable supply of food. Genetically modified crops, which have increased in use recently, are particularly well-suited to the enclosed, tightly controlled environments within a vertical farm. Another benefit is that food can be sold in the same place as it is grown. Farming locally in urban centers greatly reduces the energy costs associated with transporting and storing food, while giving city dwellers access to fresher and more organic produce. Another major advantage of vertical farming is its sustainability. Most structures are primarily powered on site, using a combination of solar panels and wind turbines. Glass panels coated in titanium oxide cover the buildings, protecting the plants inside from any outside pollution or contaminants. These are also designed in accordance with the floor plan to maximize natural light. Any other necessary light can be provided artificially. The crops themselves are usually grown through hydroponics and aeroponics, substantially reducing the amount of space, soil, water, and fertilizer required. Computers and automation are relied upon to intelligently manage and control the distribution of these resources. Programmed systems on each level control water sprayers, lights, and room temperature. These are adjusted according to the species of plant and are used to simulate weather variations, seasons, and day/night cycles. Some of the more advanced towers even use robots to tend to crops. Excess water lost through evapotranspiration is recaptured via condensers in the ceiling of each level, while any runoff is funneled into nearby tanks. This water is then reused, creating a self-contained irrigation loop. 
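The land-use figures above can be checked with some simple arithmetic. This is a rough sketch; the New York City population of roughly 8.4 million is an assumed figure, not stated in the text.

```python
# Back-of-envelope check of the vertical farm figures above.
site_ha, equivalent_ha = 1.32, 420
land_ratio = equivalent_ha / site_ha
print(f"One tower replaces ~{land_ratio:.0f}x its own footprint")  # ~318x

# Assumed NYC population of ~8.4 million, spread across 150 towers.
nyc_population, towers = 8_400_000, 150
print(f"People fed per tower: ~{nyc_population / towers:,.0f}")  # ~56,000
```

The per-tower result of roughly 56,000 people is consistent with the claim that a single site feeds "tens of thousands".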
Any water still needed for the system can be filtered out of the city's sewage system. Vertical farms also offer environmental benefits. The tightly controlled system contained in each structure conserves and recycles not just water, but also soil and fertilizers such as phosphorus, making the total ecological footprint orders of magnitude smaller than older methods of agriculture. On top of that, the reduced reliance on arable land helps to discourage deforestation and habitat destruction. Vertical farms can also be used to generate electricity, with any inedible organic material transformed into biofuel, via methane digesters. Solid waste is reaching crisis levels. Solid waste has been accumulating in urban areas and landfills for many decades. Poor funding for waste disposal and lack of adequate recycling measures, together with population growth and associated consumption, have ensured a never-ending rise in trash levels. The global output of solid waste has risen from 1.3 billion tons in 2012 to over 2.2 billion tons annually by 2025. The cost of dealing with this quantity of garbage has nearly doubled as well, rising to $375 billion annually. Developing nations, lacking the money and infrastructure to properly dispose of their trash, face the greatest crisis, with solid waste increasing five-fold in some regions. Public health is being seriously affected since groundwater is becoming more and more polluted as a result. E-waste is proving to be even more damaging. In India, for example, discarded cellphones have increased eighteen-fold. Rapid advances in technology, ever-more frequent upgrades to electronic products, and the aspiration for Western lifestyles have only exacerbated this situation. Developed nations are better able to handle the problem, but since only 30% of their waste is recycled it continues to build rapidly. Plastics are a particular problem, especially in oceans and rivers since they require centuries to fully degrade. 
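The waste figures above imply a steady compound growth rate, which can be derived directly from the two quoted totals:

```python
# Implied annual growth rate of global solid waste output, from the
# figures above (1.3 billion tons in 2012 to 2.2 billion by 2025).
start_tons, end_tons = 1.3e9, 2.2e9
years = 2025 - 2012
growth = (end_tons / start_tons) ** (1 / years) - 1
print(f"Implied growth: {growth:.1%} per year")  # ~4.1% per year
```

A rise of roughly 4% per year, compounding, is what makes the trend "never-ending" in the absence of intervention.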
As well as direct environmental damage, this waste is releasing large amounts of the greenhouse gas methane, which contributes to global warming. Public activism, though increasing at this time, has little effect in halting the overall trend. Kivalina has been inundated. Kivalina was a small Alaskan village located on the southern tip of a 7.5 mi (12 km) long barrier island. Home to around 400 indigenous Inuit, its people survived over countless generations by hunting and fishing. During the late 20th and early 21st centuries, a dramatic retreat of Arctic sea ice left the village extremely vulnerable to coastal erosion and storms. The US Army built a defensive wall, but this was only a temporary measure and failed to halt the advancing sea. By 2025, Kivalina has been completely abandoned, its small collection of buildings disappearing beneath the waves. The Alaska region has been warming at twice the rate of the USA as a whole, affecting many other Inuit settlements. At the same time, opportunities are emerging to exploit untapped oil reserves made available by the melting ice. Completion of the East Anglia Zone. The United Kingdom, one of the best locations for wind power in the world, greatly expanded its use of this energy source in the early 21st century – offshore wind in particular. With better wind speeds available offshore compared to on land, offshore wind's contribution in terms of electricity supplied could be higher, and NIMBY opposition to construction was usually much weaker. The United Kingdom became the world leader in offshore wind power when it overtook Denmark in 2008. It also developed the largest offshore wind farm in the world, the 175-turbine London Array. As costs fell and technology improved, various new projects got underway. By 2014, the United Kingdom had installed 3,700MW – by far the world's largest capacity – more than Denmark (1,271MW), Belgium (571MW), Germany (520MW), the Netherlands (247MW) and Sweden (212MW) combined. 
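The 2014 capacity figures above can be cross-checked with a quick sum:

```python
# Verifying the claim that the UK's 2014 offshore wind capacity
# exceeded that of the next five countries combined.
uk_mw = 3_700
others_mw = {"Denmark": 1_271, "Belgium": 571, "Germany": 520,
             "Netherlands": 247, "Sweden": 212}
combined = sum(others_mw.values())
print(f"Other countries combined: {combined:,} MW")  # 2,821 MW
print(f"UK lead: {uk_mw - combined:,} MW")           # 879 MW
```

The five countries together total 2,821MW, so the UK's 3,700MW does indeed exceed them combined, with a margin of nearly 900MW.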
Growing at between 25 and 35 per cent annually, the United Kingdom's offshore wind capacity was on track to reach 18,000MW by 2020, enough to supply one-fifth of the country's electricity. The largest of these projects, known as "Dogger Bank", was built off the northeast coast of England in the North Sea. This gigantic installation featured 600 turbines covering an area the size of Yorkshire and generating 7,200MW from the early 2020s. Eight other major sites were being planned around the United Kingdom with potential for up to 31,000MW. Among the biggest of these other sites was the East Anglia Zone. This was divided into six separate areas, each with 1,200MW capacity for a combined total of 7,200MW – the same as Dogger Bank. Each turbine would have a rotor diameter of 200m, and a tip height up to 245m. The first stage received planning permission in 2014 and was operational by 2019, providing a clean, renewable energy source for 820,000 homes. The remaining five stages were approved between 2016 and 2020, followed by a similar schedule for construction. When fully completed in 2025, the whole East Anglia Zone would supply a total of four million homes. With ongoing concerns over energy and climate change, offshore wind capacity in the United Kingdom continued to grow rapidly in subsequent decades. Eventually it became integrated into a continent-wide "super grid" stretching across Europe. This was followed by "peak wind" in the late 21st century as the resources utilized offshore reached a theoretical maximum of 2,200 GW – though alternative energies such as fusion had arrived by then. The UK phases out coal power. As the world's first industrialized country, the United Kingdom had a long history of coal use. Even before the Industrial Revolution, there was some evidence of coal mining in ancient and medieval times. Stone and Bronze Age flint axes, for example, were discovered embedded in coal, showing that it was mined in Britain before the Roman invasion. 
The surge of coal mining in the 18th and 19th centuries was driven by demand for steam engines, the rapid expansion of the rail network and other industries throughout the Victorian period. Coal was widely used for domestic heating, due to its low cost and widespread availability. The manufacture of coke also provided coal gas, which could be used for heating and lighting. Coal production peaked in 1913 at 287 million tons. Until the late 1960s, coal was the main source of energy produced in the UK, peaking at 228 million tons in 1952. From the 1970s onwards, the UK became increasingly reliant on imports, which coincided with initiatives for cleaner energy generation. By the 2010s, only a dozen or so coal-fired power stations remained in the UK. One third of these were closed by 2016 to meet EU air quality legislation. As part of the ongoing drive towards cleaner energy, the UK Energy Secretary proposed that coal power should be phased out within 10 years. The last remaining coal power plants in the UK are shut down by the mid-2020s. The European Extremely Large Telescope is operational. This revolutionary new telescope is built in Cerro Armazones, Chile, by the European Southern Observatory (ESO), an intergovernmental research organization supported by fifteen countries. It has the aim of observing the universe in greater detail than even the Hubble Space Telescope. The main mirror is 39 meters (129 ft). This makes it powerful enough to directly image exoplanets, study their atmospheres, and potentially detect water and organic molecules in protoplanetary disks around stars. It can also perform "stellar archaeology" – measuring the properties of the first stars and galaxies, alongside probing the nature of dark matter and dark energy. Originally planned for 2018, the observatory was delayed until 2022 due to financial problems, then delayed again until 2025. The mirror is also reduced in size slightly, having previously been 42m. 
The Giant Magellan Telescope is fully operational. The Giant Magellan Telescope (GMT) is a major new astronomical observatory completed in 2025. Costing around $1 billion, this international project is led by the US, in partnership with Australia, Brazil, and Korea, with Chile as the host country. The telescope is built on a mountain top in the southern Atacama Desert of Chile at an altitude of 2,516 m (8,255 ft). This site was chosen because of its outstanding night sky quality and clear weather throughout most of the year, along with a lack of atmospheric pollution and a sparse population giving it low light pollution. The GMT consists of seven 8.4 m (27.6 ft) diameter primary segments, with a combined resolving power equivalent to a 24.5 m (80.4 ft) mirror. It has a total light-gathering area of 368 sq m (3,960 sq ft), which is 15 times greater than the older, neighboring Magellan telescopes. It is 10 times more powerful than the Hubble Space Telescope. The GMT operates at near-infrared and visible wavelengths of the spectrum. It features adaptive optics, which helps to correct image blur caused by the Earth's atmospheric interference. The first of the seven mirrors was cast in 2005, with polishing completed to a surface accuracy of 19 nanometers RMS. By 2015, four of the mirrors had been cast and the mountain top was being prepared for construction. The GMT achieves first light in 2024, with full operational capability in 2025. It is just the latest in a series of major telescopes being constructed around this time, heralding a new era of higher resolution astronomy. Others include the Thirty Meter Telescope (2024), the European Extremely Large Telescope (2025), and the Square Kilometer Array (2027), in addition to numerous space-based observatories. 
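The GMT's mirror figures can be sanity-checked geometrically. Note that the 24.5 m "equivalent" figure refers to resolving power (set by the overall span of the segment array), not collecting area; the quoted 368 sq m is somewhat below the naive sum of seven circles, presumably due to obscuration of the central segment – an assumption on our part.

```python
import math

# Naive combined light-gathering area of seven 8.4 m circular segments.
segment_diameter_m = 8.4
naive_area = 7 * math.pi * (segment_diameter_m / 2) ** 2
print(f"Naive combined area: {naive_area:.0f} sq m")  # ~388 sq m, vs 368 quoted
```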
This new generation of telescopes leads to huge advances in knowledge of the early universe, major new discoveries of Earth-like planets around other stars, and breakthroughs in understanding the mysterious dark matter and dark energy that influence the structure and expansion of the universe. The first test flight of the Skylon spaceplane. Until now, all spacecraft launching from Earth into space have used multiple stages. This has required jettisoning parts of a launch vehicle while in flight, in order to reduce weight. During the 2020s, however, a new reusable spaceplane is developed that can operate without the need for booster rockets, fuel tanks, engines, or other external components – instead utilizing a single-stage, hybrid jet/rocket system. Known as Skylon, the vehicle is designed by Reaction Engines Limited, a British aerospace manufacturer based in Oxfordshire, England, with funding provided by the UK government, European Space Agency, and BAE Systems. The total program cost was projected to be £7.1 billion ($10.1 billion), with a unit cost of about £190 million ($270 million). BAE Systems acquired a 20% stake in the company during 2015, investing an initial amount of £20.6 million ($29.4 million) to develop the engine system. Skylon takes off from a specially strengthened runway. It uses a precooled jet engine (rather than a scramjet) to reach speeds of Mach 5.5 (1,700 m/s) at 26 km (16 miles) altitude, using oxygen in the atmosphere to "breathe". This provides a significant reduction in propellant consumption. It then closes the air inlet and operates as a highly efficient rocket to complete the remainder of its journey to orbit, 300 km (186 miles) above the Earth. This concept is known as the Synergetic Air-Breathing Rocket Engine ("SABRE"). 
Although its payload capacity is only 15 tons (about 1/3rd that of the Space Shuttle), each plane is cheaper (about 1/10th) and vastly more fuel efficient than earlier spacecraft, largely thanks to the reduced weight offered by the SABRE. After completing a mission, it reenters the atmosphere with its skin protected by a strong ceramic, landing back on the runway like a normal airplane. It then undergoes any necessary maintenance and is capable of flying again in just two days (compared to two months for the Space Shuttle). Ground-based tests of the SABRE engine commence in 2019. The first unmanned test flights were originally planned for 2020, but subsequently faced delays until 2025. Although initially crewless, the Skylon is later used to carry astronauts to and from space stations. Future versions are even capable of being adapted for space tourism, transporting up to 30 passengers in a purpose-built module and costing under $500,000 per person. Skylon is hailed as the biggest breakthrough in aerospace propulsion technology since the invention of the jet engine – revolutionizing access to space. It also leads to commercial airliners capable of travelling around the globe in under four hours. The first manned flights from Russia's new spaceport. Despite being a major space power, Russia for decades lacked its own proper independent space launch facility for manned flights. Instead, it was reliant on the Baikonur Cosmodrome in neighboring Kazakhstan – leased from the government of that nation until 2050, at a cost of $115 million per year. In 2011, construction began on the Vostochny Cosmodrome, a new spaceport located in the Amur Oblast region in Russia's Far East. This was intended to reduce Russia's dependency on Kazakhstan, enabling most missions to be launched from its own soil. 
The area devoted to this new infrastructure would be nearly 100 sq km (39 sq mi) with four separate launch pads, an airport, train station, academic campus, training and space tourism facilities, business centers and a town with a capacity of 30,000 for housing workers and their families. Roscosmos had suffered a number of setbacks and launch failures in the 2000s and early 2010s, including the loss of its Phobos-Grunt probe. To address this issue and restore the nation's reputation in space, Vladimir Putin announced a major boost in funding: a budget of 1.6 trillion rubles ($51.8 billion or €39 billion) for 2013-2020, a far greater increase than for any other space agency in the world. Nevertheless, the spaceport faced delays. The first manned flights had been scheduled for 2018 but were subsequently put back until 2025. Plans for the launch vehicle were also revised to incorporate a new craft with a two-stage, heavy-lift Angara A5B rocket, instead of the older Soyuz. Russia is now beginning a moon exploration program based on this modernized launch vehicle. High-speed rail networks are being expanded in many countries. By the mid-2020s, many countries have radically overhauled their rail transport infrastructure, or are in the process of doing so. In Spain, more than 10,000 km of high-speed track has been laid, making it the most extensive network in the world. 90 percent of the country's population now live within 50 km of a bullet train station. In Britain, the first phase of a major high-speed rail line is nearing completion. This will travel up the central spine of the country – connecting London with England's next largest city, Birmingham. It will eventually be expanded to Manchester and the north. Trains will be capable of reaching 250 mph, slashing previous journey times. In Japan, Tokyo will soon be connected with Nagoya via superfast magnetic levitation trains. 
Tests conducted in previous decades showed that it was possible to build a railway tunnel in a straight route through the Southern Japanese Alps. The first generation of these trains already held the world speed record, at 581 km/h (or 361 mph); but recent advances in carriage design have pushed this still further, to speeds fast enough to compete with commercial airliners. Many other countries are investing in high-speed rail during this time, due to its speed and convenience. Even America – which for decades had neglected its rail network – is now making big investments in this area. A comprehensive overhaul of the U.S. airspace system is complete. The final upgrades of the Next Generation Air Transportation System (NextGen) are completed this year. This has involved a complete overhaul of the existing air transport network. Many aspects of the National Airspace System (NAS) had been failing because of a reliance on largely obsolete technology. The navigation system, for example, which relied on ground-based radar beacons, was based on technology from the 1940s. NextGen brings pervasive upgrades and improvements to the entire system during the 2010s and early 2020s. This includes physical infrastructure as well as computer systems. Hundreds of new ground-based stations are built to allow satellite surveillance coverage of nearly the entire country. New safety and navigation procedures are introduced that markedly reduce flight times, while offering a more dynamic method of air traffic control. Advances in computer power and digital communication have produced what is now a far more integrated and efficient national system. One of the largest technical advances is the complete replacement of the previous radar navigation system with a modern, GPS-based version. This creates detailed, three-dimensional highways in the sky, and considers variations in topography and weather – enabling pilots to fly shorter, more precise routes. 
By 2018, this system was in place at every major US airport. Once on the runway, taxiing planes are guided by automated systems. These use data gathered on the position of every other plane and vehicle to present pilots and controllers with detailed, real-time traffic maps of the tarmac. Runway capacity is increased with the introduction of multiple take-off and landing pathways, as opposed to the older, single route approach. Overall, these upgrades offer substantial improvements in flight times, air pollution and fuel consumption. Delays are reduced by nearly 40%, saving tens of billions of dollars. Over 1.4 billion gallons of fuel are saved, and CO2 emissions are cut by 14 million metric tons. These numbers will continue to improve steadily over the years. Aircraft themselves are evolving in form, function, and efficiency. A number of striking new designs have emerged with significant technological and environmental benefits. Railguns are in use by the U.S. Navy. After years of research and development, railguns are now in common use on U.S. naval ships. Unlike traditional artillery, which creates force with explosive materials, the railgun is powered entirely by electricity from the ship's grid. It works by storing up a supply of electrical power, using what is called a pulse-forming network, which is then converted to an electromagnetic pulse. This travels up the barrel along parallel tracks of magnetic rails, forcing the projectile out of the gun, away from the power source. The weapon is capable of firing an 18-inch metal projectile, itself equipped with complex internal guidance systems, over 100 miles at close to Mach 6. This is fast enough to set the air around the projectile ablaze, while delivering it to targets in mere minutes. Explosive rounds are unnecessary since the kinetic energy released upon impact yields more power than traditional bombs of much greater size. New rapid-fire systems allow for a launch rate of around ten per minute. 
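A rough kinetic energy estimate illustrates why explosive rounds become unnecessary. This is a sketch only: the 10 kg projectile mass is an assumed figure, not stated in the text, and Mach 6 is taken at the sea-level speed of sound.

```python
# Back-of-envelope muzzle energy for a railgun projectile.
mass_kg = 10.0               # assumed projectile mass, not from the text
speed_ms = 6 * 343.0         # ~Mach 6 at sea level (343 m/s per Mach)
energy_j = 0.5 * mass_kg * speed_ms ** 2
tnt_kg = energy_j / 4.184e6  # 1 kg of TNT releases ~4.184 MJ
print(f"Muzzle energy: {energy_j / 1e6:.0f} MJ (~{tnt_kg:.0f} kg TNT)")  # 21 MJ, ~5 kg TNT
```

Since kinetic energy scales with the square of velocity, even a modest projectile at hypersonic speed delivers energy comparable to a sizeable explosive charge, concentrated at the point of impact.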
A number of technical issues first had to be overcome to reach this point, though. Advances in materials technology were required to keep the barrel from wearing out after repeated firings, while the projectiles needed to be outfitted in a way that protected internal guidance systems during launch. New cooling techniques also had to be introduced. The guns themselves originally required more electricity than standard naval ships could provide. This was overcome with advances in energy efficiency, along with ultra-dense storage batteries. In combat situations, the railgun offers major benefits. It has greater accuracy over extremely long ranges. It can be used as initial cover fire for marines landing on shore, or as a defense against incoming missiles and other threats. Ships armed with these high-tech weapons are able to attack with virtual impunity, safe from almost any retaliatory strike. Railguns become widespread around the world in the 2030s, adopted by many other navies. This devastating form of weaponry provides a considerable advantage in modern conflicts. The global crowdfunding market reaches $100bn. Crowdfunding is a form of alternative finance that involves raising monetary contributions from a large number of people – usually online – to collectively fund a project or venture. It first emerged in the arts and music communities, before eventually spreading into other areas. The rise of social media allowed it to gain popular and mainstream use. In 2009, crowdfunding generated slightly under a billion dollars worldwide, but by 2016 this had expanded 20-fold. Some of the biggest platforms included GoFundMe, Indiegogo, Kickstarter, Patreon and Teespring. With even greater potential yet to be fully realized, crowdfunding saw ongoing, rapid growth in the late 2010s and into the 2020s. 
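The growth rates implied by these figures – roughly $1bn in 2009, a 20-fold expansion by 2016, and the $100bn total named in this section's heading – can be made explicit:

```python
# Implied compound annual growth rates from the crowdfunding figures.
def cagr(start, end, years):
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

print(f"2009-2016: {cagr(1, 20, 7):.0%} per year")    # ~53% per year
print(f"2016-2025: {cagr(20, 100, 9):.0%} per year")  # ~20% per year
```

The early phase grew explosively at over 50% a year, then settled into a still-rapid 20% annual pace as the market matured.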
Further momentum was gained from the billions of new Internet users appearing online (from 1.7 billion in 2010 to 5 billion by 2020), with social media continuing to play a major role. China was now the largest market, representing half the global total, followed by the rest of East Asia. By 2025, the crowdfunding market has reached almost $100bn worldwide – roughly 1.8 times the size of the global venture capital industry a decade earlier. Crowdfunding enables creators to attain low-cost capital from people around the world, reaching untapped markets. It also creates a forum to engage with audiences in the production process via updates and sharing of feedback. Pre-release access to content, or the opportunity to beta-test products, can be offered to project backers as part of the funding incentives. Fraud is also reduced through standards-based crowdfunding platforms. The democratization of fundraising through crowdfunding is a major breakthrough for entrepreneurs and non-profit organizations, allowing them to outmaneuver larger companies and corporations. Some of the more ambitious projects being crowdfunded include satellites and space probes. BepiColombo arrives in orbit around Mercury. BepiColombo is a joint mission between the European and Japanese space agencies. It is only the third mission to study Mercury at close range and only the second to enter into orbit around the planet. Consisting of a rocket component and two science probes, the mission is launched in 2018. It performs a total of seven flybys around Earth, Venus, and Mercury before orbital insertion on 5th December 2025. It is the most comprehensive on-location study of Mercury ever performed, with 12 specific objectives: What can be learned from Mercury about the composition of the solar nebula and the formation of the planetary system? Why is Mercury's normalized density markedly higher than that of all other terrestrial planets, Moon included? Is the core of Mercury liquid or solid? 
Is Mercury tectonically active today? Why does such a small planet possess an intrinsic magnetic field, while Venus, Mars and the Moon do not have any? Why do spectroscopic observations not reveal the presence of any iron, while this element is supposedly the major constituent of Mercury? Do the permanently shadowed craters of the polar regions contain sulfur or water ice? Is the unseen hemisphere of Mercury markedly different from that imaged by Mariner 10? What are the production mechanisms of the exosphere? In the absence of any ionosphere, how does the magnetic field interact with the solar wind? Is Mercury's magnetized environment characterized by features reminiscent of aurorae, radiation belts and magnetospheric substorms observed at Earth? Since the advance of Mercury's perihelion was explained in terms of space-time curvature, can we take advantage of the proximity of the Sun to test general relativity with improved accuracy? The European contribution, Mercury Planetary Orbiter (MPO), studies the surface and internal composition, while the Japanese probe, known as the Mercury Magnetosphere Orbiter (MMO), analyses the magnetosphere and atmosphere. A new form of ion engine is used for the propulsion system. BepiColombo was originally planned for a 2014 launch with 2020 arrival at Mercury but faced a number of delays. The mission concludes in 2028.

2026: NASA's Psyche spacecraft makes orbital insertion. On 31st January 2026, a probe called Psyche enters orbit around the large metallic asteroid known as 16 Psyche. One of the most massive objects in the main asteroid belt – and the heaviest known M-type asteroid – 16 Psyche is over 200 km (120 mi) in diameter and contains about 1% of the mass of the entire belt. Psyche is an orbiter mission developed as part of NASA's Discovery Program. Its primary objective is to explore the origin of planetary cores and confirm whether 16 Psyche is the exposed iron core of a protoplanet, the remnant of a violent collision with another object that stripped off its outer crust. The spacecraft is launched in July 2022 aboard a SpaceX Falcon Heavy launch vehicle, with a Mars gravity assist at 500 km (310 mi) altitude in May 2023 before reaching its destination two and a half years later. It performs a comprehensive study of the asteroid's geology, shape, surface features, elemental composition, gravity, magnetism, and mass distribution – helping to increase the understanding of planetary formation and interiors. This information is also of potential use in future asteroid mining operations. The mission duration is 21 months, lasting until the final quarter of 2027, with a series of gradually decreasing orbits taking it closer and closer to 16 Psyche. The initial orbit starts at 700 km (435 mi) distance, while the final orbit is at 85 km (53 mi). In addition to its study of 16 Psyche, the spacecraft also tests an experimental laser communication technology called Deep Space Optical Communications – designed to increase spacecraft communications performance and efficiency by 10 to 100 times over conventional means. The laser beams from the spacecraft are received by a ground telescope at Palomar Observatory in California. A synthetic human genome is completed. In May 2010, scientists created the first artificial lifeform. 
Mycoplasma laboratorium was a new species of bacterium, with man-made genetic code originating on a computer and placed on a synthetic chromosome inside an empty cell. Using its new "software", the cell could generate proteins and produce new cells. In March 2016, the same research institute in the U.S. announced the creation of a minimal bacterial genome, known as JCVI-syn3.0, containing only the genes necessary for life, and consisting of 473 genes. A few months later, in June 2016, scientists formally announced "Human Genome Project - Write" (also known as HGP-Write), a ten-year extension of the Human Genome Project, to create a synthetic human genome. The original project – completed in 2003 – was the largest ever collaboration in biology and involved hundreds of laboratories, taking 13 years of work. It led to major developments in genomic-based discovery, diagnostics, and therapeutics. Whereas the original project (HGP-Read) was intended to "read" DNA to understand its code, the HGP-Write project would use the cellular machinery provided by nature to "write" new code, producing vast DNA chains. The bacterial genome created in 2016 had 531,000 DNA base pairs and 473 genes. By contrast, the HGP-Write project would be orders of magnitude larger and more complex, with three billion base pairs and 20,000 genes. However, the earlier work on bacterial genomes had paved the way for new tools and semi-automated processes for whole genome synthesis. HGP-Write would cut the costs of engineering and testing large genomes in cell lines by more than 1,000-fold within ten years. Alongside this, an ethical framework for biological engineering was being developed. Longer term, the project would lead to transformative applications. Previously, the capability to construct DNA sequences in cells was mostly limited to a small number of short segments, restricting the ability to manipulate and understand biological systems. 
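The scale jump from the minimal bacterial genome to HGP-Write, described above, can be put in plain numbers:

```python
# Relative scale of HGP-Write versus the 2016 minimal bacterial genome.
syn3_bp, syn3_genes = 531_000, 473            # JCVI-syn3.0
human_bp, human_genes = 3_000_000_000, 20_000 # human genome targets
print(f"Base pairs: ~{human_bp / syn3_bp:,.0f}x larger")  # ~5,650x
print(f"Genes: ~{human_genes / syn3_genes:.0f}x more")    # ~42x
```

A synthesis task several thousand times larger than anything previously attempted is what makes the project's tooling and automation goals so central.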
After the completion of HGP-Write, the ability to synthesize large portions of the human genome leads to major advances – in medicine, agriculture, energy, and other areas – by connecting the sequence of bases in DNA with their physiological and functional behaviors. Some health applications that arise from HGP-Write include the growing of transplantable human organs, engineering of immunity to viruses in cell lines, engineering cancer resistance in cell lines, and enabling high-productivity vaccines at low cost. HGP-Write involves taking synthetically constructed DNA to produce a human genome able to power a single cell in a dish. In the more distant future, however, this area of biology advances to the point where entire synthetic people can be designed from scratch – new custom-made "super humans" able to resist all diseases and infections, or made immune to the radiation and vacuum of space, for example. This leads to profound ethical questions about the nature of life. Aquaculture provides the majority of the world's seafood. Aquaculture – the cultivating of freshwater and saltwater fish under controlled conditions – has remained one of the fastest growing industries in the agricultural sector. Since the late 1980s, traditional "capture" fisheries have been on a plateau. Aquaculture, by contrast, increased by 8.8% per year from 1985 to 2010 and had witnessed an eightfold increase by the mid-2020s. It now accounts for the majority of the world's seafood, surpassing wild catch harvests by weight. The capture fishing industry itself has faced severe problems. Overfishing, climate change and pollution have all contributed to the sharp decline of yields. Numerous regions have experienced near-collapse or total collapse and will take decades to repopulate. Examples include the UK cod and Chilean jack mackerel fishing industries. 
The largest centers for aquaculture remain in East and Southeast Asia – with the Philippines, Cambodia, Vietnam, Thailand, and Indonesia seeing large increases in production. Cambodia in particular has seen massive growth. New techniques have been adopted, helping to increase both sustainability and yield. One such method, used for the cultivation of jumbo shrimp, is super-intensive stacked raceways. Shrimp are grown in large, enclosed tubes called raceways, in which computers monitor and control a steady circulation of mineral water. As they mature, they are moved down the stacked columns of tubes until they reach the final bottom row, fully grown, where they are harvested. This method greatly increases the output of shrimp farms – up to one million pounds of shrimp per acre – and can be deployed almost anywhere. Water usage is lowered significantly. This method also helps to alleviate the many environmental harms associated with traditional shrimp farming. Another method being utilized is land-based, closed-loop recirculating aquaculture systems. These indoor systems recycle around 98% of their water, with little-to-no discharge back into the environment. The risk of disease in a closed-loop system is essentially zero, minimizing the use of chemicals or antibiotics. Being entirely independent of any particular environment, these types of fish farms can be built anywhere, regardless of the distance from any major body of water. The growth of aquaculture has caused a major shift in commerce and trade. Countries previously reliant on imports are now capable of producing vast quantities of fish, crustaceans, seaweed, and other seafood. Countries with dwindling natural fisheries benefit, now being able to produce as much or even more than can be caught from lakes or the ocean. Numerous startup companies have appeared to serve the growing industry.
Aquaculture as a whole will become one of the most vital industries in the world this century, as traditional commercial fishing collapses under unsustainable yields. The High Luminosity Large Hadron Collider (HL-LHC) is operational. The High Luminosity Large Hadron Collider (HL-LHC) is a major upgrade of the Large Hadron Collider (LHC) that is completed by 2026. This new design boosts the machine's luminosity by a factor of between five and seven, allowing 10 times more data to be accumulated, providing a better chance to see rare processes, and improving statistically marginal measurements. Luminosity is a way of measuring the performance of an accelerator: it is proportional to the number of collisions that occur in a given amount of time. The higher the luminosity, the more data that can be gathered during an experiment. The HL-LHC can perform detailed studies of the new particles observed at the LHC, such as the Higgs boson. It enables the observation of rare processes that were inaccessible at the previous sensitivity levels. More than 15 million Higgs bosons can be produced each year, for example, compared to the 1.2 million produced in 2011-2012. The development of the HL-LHC depends on several technological innovations that are exceptionally challenging to researchers – such as cutting-edge 11–12 tesla superconducting magnets, very compact and ultra-precise superconducting cavities for beam rotation, and 300-metre-long high-power superconducting links with zero energy dissipation. Together, these upgrades help to advance and further refine the knowledge already gained from the Higgs boson and provide fresh insights into so-called "New Physics", a more fundamental and general theory than the Standard Model. The International Linear Collider is completed. This project is the culmination of more than 25 years of concerted international efforts, with funding and research from Europe, Asia, and the Americas.
Over 300 universities and laboratories have taken part. It originated as a series of three separate collider proposals – the Next Linear Collider (NLC), the Global Linear Collider (GLC) and the Teraelectronvolt Energy Superconducting Linear Accelerator (TESLA) – all of which were combined into the International Linear Collider (ILC). Located in Europe, the ILC is the successor to the Large Hadron Collider (LHC), building upon the work already done by that machine. Although its collisions are less powerful, it offers far more precise measurements. Its linear design also loses far less energy to synchrotron radiation than a circular collider. The ILC consists of two opposite-facing linear accelerators, together stretching 31 km (19.3 miles), that hurl particles and anti-particles towards each other at close to the speed of light. Along with the linear accelerators, the facility contains two damping rings, with a circumference of 6.7 km (4.2 miles). Collision energies are initially 500 gigaelectronvolts (GeV) but are soon upgraded to one teraelectronvolt (TeV). The extreme precision and exact recordings offered by the ILC help to reveal some of the deepest mysteries of the universe. Some experiments are concerned with extra-dimensional physics and supersymmetric particles, while others provide research into dark matter. Originally planned for completion in 2019, the ILC faced considerable delays due to funding, technical issues, and international agreements. It is finally ready by 2026. 3-D printed electronic membranes to prevent heart attacks. Following years of clinical trials – initially in rabbits and later in humans – a new device is available that can dramatically improve the monitoring and treatment of cardiac disorders. This consists of an ultra-thin membrane, specially customized and 3-D printed to exactly match the patient's heart shape.
Tiny sensors embedded in a grid of flexible electronics measure pulse, temperature, mechanical strain and pH level with far greater accuracy and detail than was possible using previous methods. Doctors can determine the heart's overall health in real-time and predict an impending heart attack before a patient has any physical signs – intervening when necessary to provide therapy. The device itself can deliver a pulse of electricity in cases of arrhythmia. This electronic membrane can be installed in a relatively non-invasive procedure, by inserting a catheter into a vein beneath the ribs and then opening the mesh like an umbrella. At present, it is restricted to the exterior surface of the heart. However, new and more advanced versions are now being developed that will go directly inside the heart to treat a variety of disorders – including atrial fibrillation, which affects 2.5 million U.S. adults and 4.5 million people living in the EU, accounts for one-third of hospitalizations for cardiac rhythm disturbances, and is a major risk factor for stroke. Great progress is now being made in the monitoring, diagnosis, and treatment of heart disorders, thanks to this and other breakthroughs emerging at this time, all of which are contributing to a rapid decline in mortality rates. By the 2040s, deaths from cardiovascular disease will reach negligible levels in some nations. Youthful regeneration of aging heart muscle via GDF-11. In the previous decade, researchers identified an obscure blood protein called GDF-11. This was shown to have regenerative effects on cardiac muscle in age-related diastolic heart failure. The substance was found to be present at high levels in youth, and lower levels in old age. When elderly mice were supplemented with increased GDF-11, it had a dramatic effect on their hearts – restoring heart size and muscle wall thickness to a much earlier state. This offered a potential way of treating heart failure and aging in people.
A series of clinical trials, beginning in the late 2010s, confirmed this. By 2026, it is becoming fairly routine for doctors to repair cardiac damage and restore human hearts to earlier states, based on the GDF-11 protein. Along with stem cells and other advances this decade, science is gradually chipping away at the factors which cause people to die. New treatments for Alzheimer's disease. Alzheimer's is the most common form of dementia. This incurable, degenerative and terminal disease affects over 27 million people worldwide, mostly aged over 65. The most common symptom is the inability to acquire new memories and difficulty in recalling recently observed facts. As the disease advances, further symptoms include confusion, irritability and aggression, mood swings, language breakdown, long-term memory loss, and the general withdrawal of the sufferer as their senses decline. Bodily functions are gradually lost, ultimately leading to death. Until recently, the precise mechanisms behind the illness were poorly understood. In 2011, however, genes were identified that played a key role in biological pathways such as inflammation, cholesterol, and cell transport systems. These provided new targets for potential treatments in the form of drugs, behavioral changes, and other therapies. New ways of delivering drugs to the brain were also found, such as using the body's own exosomes as carriers. After 15 years of research and clinical trials, the risk of developing the disease has now been cut by over 60%. With a better roadmap to guide progress with the remaining genes and biological processes, there is now real hope of actually curing the disease in the 2030s. Russia debuts its first reusable rocket. In the late 20th century, NASA developed the first reusable launch vehicle to reach orbit – the Space Shuttle. However, this failed to accomplish the intended goal of reducing launch costs to below those of expendable launch systems.
During the early 21st century, commercial interest in reusable launch systems grew considerably, with several active launchers. SpaceX's Falcon 9 rocket featured a reusable first stage and capsule, while Virgin Galactic flew reusable suborbital spaceplanes, and the suborbital Blue Origin New Shepard rocket had recoverable first stages and crew capsules. Eager to join this race and become more competitive, Russian space agency Roscosmos announced in 2020 that it would develop a preliminary design for the Amur-SPG, its first rocket with reusable capability. The maiden flight of the Amur occurs in 2026 at the Vostochny Cosmodrome, in a part of eastern Russia known as the Amur region (hence its name). It bears a striking resemblance to the Falcon 9 – such as the stabilizing grid fins on the rocket's first stage, and the intended technical specification of reusing each booster up to 100 times. However, the Amur is smaller and less powerful than the Falcon 9, standing only 55 m (180 ft) tall, and with a capacity of 10.5 metric tons to low-Earth orbit (LEO). By contrast, the Falcon 9 is 70 m (230 ft) tall and can lift 22.8 metric tons to LEO. The total cost of developing the Amur-SPG system is 70 billion rubles (about US$900 million), with a per-launch cost of $22 million. This is cheaper than both a brand new ($60m) and reused ($50m) Falcon 9, albeit with a smaller payload. Landing platforms are installed on the coast of the Sea of Okhotsk (north of Japan), with vertical, powered landings of the first stage followed by collection and return to the cosmodrome, either by a heavy Mi-26 transport helicopter or by rail. The Amur-SPG's relatively small size and payload limit it to niche markets, at least initially. However, a larger and more advanced derivative emerges in the 2030s with greater economy of scale, and full reusability for both stages.
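The per-launch prices and maximum payloads quoted above can be reduced to a cost-per-kilogram figure, which illustrates the trade-off involved (a sketch; real mission pricing depends on how much of each rocket's payload capacity is actually used):

```python
# Hypothetical cost-per-kg comparison to LEO, using the per-launch
# prices and maximum payload figures quoted in the text.
rockets = {
    "Amur-SPG":          (22_000_000, 10_500),   # $22m, 10.5 metric tons
    "Falcon 9 (new)":    (60_000_000, 22_800),   # $60m, 22.8 metric tons
    "Falcon 9 (reused)": (50_000_000, 22_800),   # $50m, 22.8 metric tons
}

for name, (price_usd, payload_kg) in rockets.items():
    print(f"{name:18s} ${price_usd / payload_kg:,.0f}/kg")
```

At full payload the Amur works out slightly cheaper per kilogram (roughly $2,100/kg versus $2,200–2,600/kg), but since few customers fill a rocket to capacity, its lower absolute launch price is what makes it attractive for the niche, small-payload markets mentioned above.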
Combined with additional new rockets being developed by other space agencies and private companies, this contributes to an ongoing decline in launch costs to low Earth orbit. Global reserves of indium are running out. Indium is a rare, soft, and malleable post-transition metal, found primarily in zinc ore. It is mined almost exclusively in Canada, China, the US, and Russia. Indium is used in various electronic applications such as LCDs and touchscreens, solar cells, LEDs, and various batteries. It is also useful in making alloys, in medical imaging, and in the control rods of nuclear reactors. Its role in electronic screens drives most of the production demand, which by now has resulted in global reserves being almost completely exhausted. Recycling is one option being pursued to solve this problem, but it will only suffice in the short term. Fortunately, new alternative materials are being introduced, derived from carbon nanotube compounds, that can take on the role previously filled by indium. Italy hosts the Winter Olympics. The 2026 Winter Olympics take place from 6th to 22nd February 2026, in the Italian cities of Milan and Cortina d'Ampezzo. Italy had beaten a rival joint bid from the Swedish cities of Stockholm and Åre, by 47–34 votes, at the 134th Session of the International Olympic Committee (IOC) held in Lausanne, Switzerland, on 24th June 2019. This marks the fourth time that the Olympic Games have been hosted by Italy, the first time they are held in Milan, and the first Olympic Games to feature two host cities in its name. It occurs on the 20th anniversary of the 2006 Winter Olympics in Turin and the 70th anniversary of the 1956 Winter Olympics in Cortina d'Ampezzo. Construction of the Sagrada Família is complete. The Sagrada Família is a massive, privately funded Roman Catholic church that has been under construction in Barcelona since 1882.
Considered the masterwork of renowned Spanish architect Antoni Gaudí (1852–1926), the project has a vast scale and idiosyncratic design that have made it one of Spain's top tourist attractions, visited by millions of people each year. Construction of the building is finally completed this year, the 100th anniversary of Gaudí's death. Robotic hands matching human capabilities. As part of the ongoing rise of consumer-level robotics, recent research in artificial intelligence and bio-inspired devices has opened up a new level of possibilities. Modern robots are now able to fill an increasingly broad scope of roles in both home and work environments. Easily one of the most important (and difficult) abilities for such machines is being able to recognize and interact with various physical objects. For simple or repetitive tasks, such as assembly line production, this was relatively straightforward, requiring only basic programming and mechanical systems. However, the growing complexity of environments that commercial robots now have to encounter has driven research into more intricate and capable mechanisms. As has often been the case, engineers turned to the human body itself to model both the form and function of new robot apparatuses. Since almost all robots must interact with and handle physical objects in some way, among the most commonly emulated body parts is the hand. Along with their associated computer programs and visual recognition software, robotic hands of the 2000s and 2010s had already boasted some impressive abilities. They could pick up delicate objects, catch objects thrown to them, make a range of gestures, fold towels, pour drinks and even prepare meals. Despite this, the sheer dexterity and flexibility of the human hand and the practical limits of mechanical components prevented scientists from achieving a perfect recreation.
By the second half of the 2020s, however, the techniques involved have become sufficiently advanced to overcome most of the obstacles faced in previous decades. Around this time, some of the first robot hands equaling the capabilities of human hands are appearing in the laboratory. Advances in nanotechnology, miniaturization and micro-electronics have allowed engineers to account for almost all of the subtle movements performed by a living biological hand. Graphene-based actuators converting electricity into motion, artificial skin, tactile sensors, flexible electronics, and various other features are employed to emulate the real thing. This has also been the result of an improved biological understanding of how humans manipulate objects. AI programs, using precise visual perception software, are able to recognize countless physical objects and intelligently plan for how they can be manipulated. The robotic hand is therefore able to function autonomously and self-adjust to different objects based on texture, weight, and shape. All of this can be accomplished in fluid, natural movements that are largely indistinguishable from those of a real hand. Though still in the trial stage, such systems will prove extremely useful in the development of human-like robots and androids. By the following decade, the subtle capabilities offered by robotic hands will allow machines to interact with humans and their environment in myriad new ways. The FIFA World Cup is hosted jointly in Canada, Mexico, and the United States. The 23rd FIFA World Cup, held in the summer of 2026, is the first tournament hosted by more than two countries. It takes place in Canada, Mexico, and the United States, who won the rights to host the contest at the FIFA Congress in Moscow on 13th June 2018, beating a rival bid from Morocco. The 2026 tournament is the biggest World Cup ever held, after FIFA approved an expansion from 32 teams to 48. 
A total of 80 matches are played – 60 in the US (including all eight matches from the quarterfinals onward), while Canada and Mexico host 10 matches each. The final takes place at the 85,000-capacity MetLife Stadium in East Rutherford, New Jersey. Mars Science Laboratory is shutting down. This 900 kg (2,000 lb), six-wheeled rover has been transmitting back to Earth since 6th August 2012, the day it touched down on Mars. Although its planned mission duration was around two years, it continued to be operational for considerably longer, like the previous rovers, Spirit and Opportunity. In fact, its onboard plutonium generator provided enough heat and electricity to last 14 years. By 2026, the machine is finally grinding to a halt. The last signal is received from the rover this year. The first uncrewed exploration of a lunar lava tube. In 2026, British company Spacebit conducts the third in a series of missions to the Moon using Asagumo robots. These walking, spider-like machines had previously been deployed to gather data on the presence of ancient lava tubes. With a suitable site having been identified, this latest study involves the robots actually venturing inside a tunnel. Here, they make detailed 3D scans of a cave entrance and portions of the interior wall. The data is relayed back to Earth via an accompanying "mothership" rover. In addition to providing scientific evidence about past volcanic activity, lunar crust composition, and stratification that occurred in the Moon's deep interior, the knowledge gained is of use in planning future human colonies. 50TB hard drives. During the 2010s, solid state drives (SSDs) became the preferred choice for running computer operating systems and applications, due to their much greater speed than traditional spinning hard drives. However, the latter still had a role to play in archiving/backups and general storage. As such, they did not disappear and continued to form a part of computing ecosystems.
Ongoing research and development led to huge capacities, with 20 terabytes (TB) emerging by the start of the next decade, based on shingled magnetic recording (SMR) technology. An even greater innovation – heat-assisted magnetic recording (HAMR) – boosted capacities still further. This technology involved tiny spots on the drive platter being heated to 450°C (842°F) and then cooled back down to room temperature in less than a nanosecond. During this process, the spots would become more receptive to magnetic effects, allowing data to be written to much smaller spaces than with conventional magnetic recording (CMR). Additionally, new drives with multiple actuators enabled the vast volumes of data to be read at speeds matching or exceeding conventional HDDs, making them practical for everyday use. Starting at around 20TB, these new HAMR drives quickly expanded in capacity over the next several years, reaching 50TB by 2026. Although the gap is closing in terms of worldwide byte shipments, conventional hard drives continue to lead over SSDs for some time to come, thanks to their affordability and greater capacities. The global "datasphere", or the amount of digital data worldwide, has increased from 33 zettabytes in 2018 to over 200 zettabytes by 2026 and continues to grow exponentially. The Hinkley Point C nuclear power station is operational. The UK's first commercial nuclear reactor began operating in 1956 and, at the peak in 1997, 26% of the nation's electricity came from nuclear power. In the early 21st century, however, many of these aging reactors were being retired and the share had declined to 19% by 2012. Of the remaining nine plants – with a combined capacity of 9,000 MW – eight were due for closure by the early 2020s. Not only that, but coal power stations needed replacing too. The UK faced the prospect of losing two-thirds of its electricity by 2030 without major investment to improve its energy infrastructure.
In 2011, the government announced plans for a new fleet of nuclear power stations. All would be constructed at or near the existing nuclear power sites to minimize disruption. The first of these was Hinkley Point C, proposed next to Hinkley A and B, two older stations. This was approved in 2013, with construction starting in 2018. Two new reactors would be installed with a combined capacity of 3,200 MW – enough to supply an area twice the size of London, accounting for nearly 10% of the UK's electricity demand. The project – funded by a consortium of French and Chinese investors, including EDF Group – came under heavy criticism for its high costs, especially when compared to the latest solar PV and wind farms. Initially expected to be £14bn ($19.3bn), the cost had mushroomed to £23bn ($31.7bn) by 2021. EDF had agreed a price of £92.50 per megawatt hour (MWh) for electricity produced at Hinkley Point C. However, offshore wind developers building the latest turbine arrays in waters off the UK's coast were offering less than £40/MWh by 2021. Hinkley Point C also faced considerable delays in construction. Originally planned for connection to the national grid by 2017, the developers pushed this schedule back again and again. The facility is finally operational by 2026, the first new fission power plant in the UK since 1995. It has an operational lifetime of 60 years.
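The gap between the Hinkley strike price and the offshore wind prices cited above can be expressed as a simple ratio (a sketch using the 2021 figures quoted in the text; both are prices per megawatt hour, not total system costs):

```python
# Comparing the agreed Hinkley Point C strike price with the
# contemporary UK offshore wind price, both in GBP per MWh.
hinkley_strike = 92.50   # £/MWh, agreed for Hinkley Point C
offshore_wind = 40.00    # £/MWh, latest UK offshore wind (approx. upper bound)

premium = hinkley_strike / offshore_wind
print(f"Hinkley electricity costs ~{premium:.1f}x the offshore wind price")
```

A premium of roughly 2.3x is the core of the cost criticism, though nuclear's firm, weather-independent output is not captured by a per-MWh comparison alone.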

2027: The BRICs overtake the G7. By this date, the major emerging markets – Brazil, Russia, India, and China, a.k.a. the BRICs – have overtaken the combined GDP of the G7 nations. Light-duty hydrogen fuel cell vehicles reach 1 million in annual sales. The first vehicles powered by hydrogen fuel cells emerged during the Cold War Space Race. These were bulky, inefficient, and expensive, however. It was not until the dawn of the 21st century that car manufacturers began to take the concept more seriously. In 2003, President George W. Bush proposed the Hydrogen Fuel Initiative (HFI), which was later implemented by legislation through the 2005 Energy Policy Act and the 2006 Advanced Energy Initiative. This aimed to further develop hydrogen fuel cells and infrastructure with the goal of producing commercial fuel cell vehicles. By 2008, over $1 billion had been contributed to the project. However, the U.S. Department of Energy later shifted its interest from fuel cells to battery vehicles. High costs and the lack of infrastructure were major problems, compounded by the challenge of deploying fuel stations over such a large geographical area. This led U.S. automakers to delay their hydrogen vehicle rollouts. The situation was different in Europe, however – with less geography to cover – and where governments had serious plans to develop the required infrastructure. Japan and South Korea, too, had similar initiatives getting underway. In addition to expanded infrastructure, further advances in technology reduced the size, weight, and cost of hydrogen fuel cell vehicles. By 2027, global sales of these non-polluting vehicles have reached one million annually for the first time. Although still only a tiny proportion of overall vehicle sales, the industry is now entering a period of explosive growth. Tokyo and Nagoya are connected by high-speed maglev. Two of Japan's largest cities – Tokyo and Nagoya – are now connected by the Chūō Shinkansen, a high-speed maglev route.
This 178-mile (286 km) line runs beneath the Japanese Alps (Akaishi Mountains) at speeds of up to 313 mph (505 km/h), enabling journey times of just 40 minutes. Built by the Central Japan Railway Company at a cost of 9 trillion yen ($115bn), the route will be extended to Osaka by 2045. Carbon sequestration is underway in many nations. Following years of research and development, various new techniques are now being utilized for trapping and removing CO2. This is offering fresh hope for mitigating the effects of climate change. The most significant technology is "clean coal", being fitted to power plants. This is seeing widespread adoption since it now costs less than unsequestered coal-based power generation. The carbon dioxide is stored in geological formations deep underground (including some empty oil wells). Great care and precision must be taken in choosing these sites, however, as dumping the gas in an unstable location may cause it to leak back up to the surface or contaminate aquifers used for drinking supplies. Another method of carbon sequestration which is showing great potential is the deployment of "artificial trees". These are shaped like giant fly swatters, around 10 m high, and have become an increasingly common sight along roads, freeways, and other polluted areas. The trees capture CO2 through a filter system – thousands of times more efficiently than real trees – which is then removed and stored. Another project involves strips of algae, fitted to the sides of buildings, which naturally absorb CO2 through photosynthesis. They are most common in high-density urban centers, where tall buildings offer a much greater surface area. These "photobioreactors" not only sequester carbon but can also produce biofuel and biochar as beneficial side effects. The biofuel can be used to generate energy whilst keeping net carbon emissions to zero, while the biochar can be used as a very good fertilizer.
Yet another project is the addition of highly reflective panels on rooftops. These reflect sunlight back into space, reducing the amount of solar radiation being absorbed by the Earth. Although efficient, the various techniques described above (and others) do not represent the ultimate solution to global warming. The only effective, long-term process for stabilizing the climate is the adoption of solar, wind, hydro, nuclear and other low-carbon energy sources. Thankfully, almost all developed countries now have legally binding commitments in place for reducing CO2 emissions and have begun large-scale practical measures. Britain, for example, has cut its carbon dioxide pollution by 50% compared to 1990 levels, thanks to legislation enacted in 2011. The Venera-D mission arrives at Venus. Venera-D is a Russian space probe sent to study the atmosphere and surface of Venus. The primary aim of the mission is to understand the history and evolution of the planet. Venera-D was first proposed to the Russian Academy of Sciences in 2003, with a planned launch date of 2013. The original design featured a large orbiter, sub-satellite, two balloons, two small landers and one larger, longer-lived lander. However, it was scaled down and delayed until the late 2020s. The final configuration has two components: one orbiter and one lander. The spacecraft is launched in 2026, arriving in 2027. The orbiter includes several spectrometers, a plasma package and a camera. Its radar remote-sensing equipment is far more powerful than that of the Venera 15 and 16 probes of the 1980s, or NASA's Magellan in the 1990s. A wealth of new data is gathered to more accurately characterize the composition of the atmosphere, clouds and their structure, radiative balance and the nature of the greenhouse effect, ionosphere, magnetosphere, and electrical activity, along with the gas escape rate in the upper atmosphere.
This reveals new insights into the early history of Venus and the oceans of water it held in the ancient past. Knowledge is also gained about the nature of "super-rotation" – a phenomenon in which the atmosphere circles the planet in just four Earth days, far faster than the planet's sidereal day of 243 days. Wind speeds decrease at lower altitudes, barely reaching 10 km/h (6 mph) on the surface. In addition to the orbiter, there is a lander, which investigates one of the planet's tesserae. These are regions of very old, heavily deformed terrain characterized by intersecting tectonic elements, high topography, and high radar backscatter. This becomes the first probe to successfully touch down on Venus's surface since the Soviet Union's Vega 2 mission, 42 years earlier in 1985. It features a high-resolution camera to obtain panoramic images of both its descent through the atmosphere and the surroundings at ground level. The lander investigates the structure and chemical composition of the atmosphere and performs chemical analysis of surface materials. The interaction between the surface and the atmosphere is studied. The lander also characterizes the geology of local landforms at different scales. Due to the hellish conditions on Venus (temperatures of 462°C, atmospheric pressure 92 times that of Earth, and clouds of sulphuric acid), the lander was originally expected to last for only one or two hours. However, with new materials and more robust electronic systems, it is redesigned to survive longer than any previous mission. This is helped in part by a collaboration with NASA, who supply a number of components. Venera-D is the first in a new generation of Russian probes to Venus. The mapping data obtained by the orbiter is used to determine the location of potential future landing sites. These follow-up missions also include sensors designed to search for signs of life in the mid-level atmosphere, which is seen as a possible habitat for extremophile organisms.
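The super-rotation figures above imply remarkably fast upper-atmosphere winds. A rough estimate of the equatorial wind speed follows from a four-day circuit of the planet (a sketch; Venus's radius of ~6,052 km is an assumed figure not given in the text):

```python
import math

# Estimate of the equatorial cloud-top wind speed implied by Venus's
# atmospheric "super-rotation": the atmosphere laps the planet in
# ~4 Earth days. Venus's mean radius (~6,052 km) is an assumption.
radius_km = 6_052
circumference_km = 2 * math.pi * radius_km   # ~38,000 km around the equator
period_hours = 4 * 24                        # four Earth days

speed_kmh = circumference_km / period_hours
print(f"Implied cloud-top wind speed: ~{speed_kmh:.0f} km/h")
```

A figure of roughly 400 km/h at the cloud tops, falling to about 10 km/h at the surface as the text notes, illustrates how sharply the wind profile changes with altitude.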
The asteroid 1999 AN10 makes a close approach. 1999 AN10 is an asteroid of the Apollo group, a collection of Earth-crossing bodies, many of which are large enough, and can drift close enough to Earth, to be considered potentially hazardous. It was discovered by U.S. scientists on 13th January 1999. The object was estimated to have a diameter of up to 1,800 m (5,905 ft), or about a mile – enough to cause continent-scale devastation, should an impact occur on Earth. AN10 circles the Sun every 643 days and twice in each orbit passes through the Earth's local neighborhood. On 7th August 2027, it makes a particularly close approach as it comes within just 0.0026 AU (390,000 km; 240,000 mi), about the same as the distance between the Earth and Moon. The asteroid reaches a peak apparent magnitude of 7.3, bright enough to be visible in binoculars. Its orbit remains dangerously close to Earth's for the next 600 years. The Square Kilometer Array begins science operations. Humanity's view of the universe is greatly expanded with the completion of a major new observatory. The Square Kilometer Array (SKA) is a radio telescope with a combined collecting area of approximately one square kilometer. It operates over a wide range of frequencies and its size makes it 50 times more sensitive than any other comparable instrument. By utilizing advanced processing technology, it can survey the sky more than 10,000 times faster than ever before. With additional stations extending to a distance of 3,000 km from a concentrated central core, it continues radio astronomy's tradition of providing the highest resolution images in all of astronomy. First light is achieved in 2027 and the telescope is fully operational by 2030. The autopsy report for Elvis Presley is made public. Elvis Presley was one of the 20th century's most iconic singers, often referred to as the "King of Rock and Roll", or simply "the King".
After cementing his place among the legends of American music, Presley spent his final years beset by serious health problems. These were aggravated by, and possibly caused by, drug dependence. On 16th August 1977, Presley was found unresponsive on the bathroom floor of his Graceland home. Attempts to revive him failed, and death was officially pronounced at 3:30pm at Baptist Memorial Hospital, Tennessee. Controversy surrounded the autopsy and its aftermath as the competence and ethics of two medical professionals were questioned. One had his license permanently revoked, after charges were brought by the Tennessee Medical Board. An overdose of prescription drugs (including codeine, Demerol, morphine, and Valium, to name a few) had apparently caused Elvis' heart to beat irregularly and then stop. However, rumors of a cover-up were rife as the exact cause of death remained unclear. Vernon Presley, Elvis' father, had the complete autopsy report sealed for a period of 50 years. The controversy surrounding the autopsy – as well as various other questionable circumstances – led conspiracy theorists to proclaim that the King was still alive. After his funeral, there were numerous alleged sightings of Presley. A long-standing theory among fans was that he had faked his own death. Some fans noted apparent discrepancies in the death certificate, a curiously different spelling of the middle name on his gravestone, reports of him cutting family members out of his will shortly before he died, rumors of a wax dummy in his original coffin, and numerous accounts of Presley planning a diversion so he could retire in peace. The full autopsy report is finally made public on 16th August 2027, exactly half a century after Presley's death. Opening of the New Central Polish Airport. The New Central Polish Airport (also known as "Solidarity Transport Hub") is a megaproject to construct a new, built-from-scratch airport located 40 km southwest of Warsaw, Poland. 
This replaces the aging and overcrowded Warsaw Chopin Airport. Initially opening with two runways and a capacity of 45 million passengers per year, the airport receives a substantial upgrade in subsequent years – boosting its capacity to around 100 million, with four runways. Alongside a major expansion of the surrounding rail network, this forms the largest transportation hub in Central and Eastern Europe, serving as a gateway to and from Asia. In terms of total passenger numbers, it eventually rivals some of the busiest airports in the world, including the likes of Hartsfield–Jackson in Atlanta, Beijing Capital International Airport, and Dubai International Airport. More than 100 Polish cities receive a direct rail link to the new airport, allowing for connections to the airport-rail hub from the most important urban centers in Poland. This includes high-speed routes, with trains running at 250 km/h (155 mph) on some sections. The project faces considerable opposition from local residents, as well as concerns over its financial viability, but is approved by the government. A series of design concepts by world-leading architects emerged in 2019, with a winning candidate selected in 2020. Costing 35 billion złoty ($9.4 billion), the initial phase of construction is completed by 2027. Other airports in Europe are undergoing major expansions in this decade, as global demand for air travel continues to increase. For example, a third runway opens at Heathrow Airport, London, in 2029. The New Central Polish Airport allows Poland to compete with western European countries in the aviation market and establish a greater share of international traffic. This generates an extra $7 billion in annual GDP and 65,000 new Polish jobs by 2035. Completion of the Australia–ASEAN Power Link. The Australia–ASEAN Power Link (AAPL) is a clean energy megaproject that includes the world's largest solar plant, the world's largest battery, and the world's longest submarine power cable. 
The solar photovoltaic (PV) plant is located near Elliott in Australia's Northern Territory. The PV modules cover 12,000 hectares (30,000 acres) in a region with some of the best sunlight in the world, which enables a generating capacity of 10,000 MW. For comparison, the largest operating solar farm in 2020, the Bhadla Solar Park in northwest India, had a nameplate capacity just 22.4% as high. With a total area of 120 km², the AAPL is large enough to be visible from space. Running north from the AAPL is a high-voltage, direct current (HVDC) transmission line, which delivers electricity for export to southeast Asia. This passes through Darwin on Australia's north coast, before continuing as an undersea power cable, stretching a total of 4,500 km (2,800 mi). Most of the exported power is supplied to Singapore, enough to provide 20% of the city's total electricity. Batteries in both Darwin and Singapore provide 30 GWh of load-balancing capacity to account for variability in sunlight and energy demand, ensuring a continuous supply. Developed by Sun Cable, a Singaporean firm founded in 2018, the total cost of the AAPL is S$22 billion (US$16.5 billion). Construction begins in mid-2023 with operations starting in early 2026 and full completion by late 2027. With its sheer size, capacity, and cable length, the AAPL forms the major backbone of a continent-wide super grid that emerges in Australasia and Oceania during the mid-21st century. Further into the future, this vast electrical infrastructure is boosted even more, by the addition of room-temperature superconductors that provide lossless transmission.
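The scale of these figures can be sanity-checked with some quick arithmetic. The sketch below uses the numbers quoted above; the Bhadla nameplate capacity (~2,245 MW) is an outside assumption based on published figures, not a value from this text, and losses are ignored.

```python
# Back-of-the-envelope check of the AAPL figures quoted above.
# The Bhadla nameplate capacity is an assumed value, not from the text.
aapl_capacity_mw = 10_000     # AAPL generating capacity
bhadla_capacity_mw = 2_245    # assumed Bhadla Solar Park nameplate
battery_capacity_gwh = 30     # combined Darwin + Singapore storage

# Bhadla's nameplate as a fraction of the AAPL's (the article says 22.4%)
ratio = bhadla_capacity_mw / aapl_capacity_mw

# Hours the batteries could sustain the plant's full output,
# ignoring transmission and conversion losses
hours_at_full_output = battery_capacity_gwh * 1_000 / aapl_capacity_mw

print(f"Bhadla/AAPL capacity ratio: {ratio:.1%}")
print(f"Battery buffer: {hours_at_full_output:.1f} h at full output")
```

On these assumptions, the 30 GWh buffer corresponds to roughly three hours of the plant's full output, consistent with its role as a load-balancing reserve rather than overnight storage.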

2028: China's economy surpasses that of the U.S. In 2028, China overtakes the United States in gross domestic product (GDP) and becomes the world's largest economy. This milestone has occurred earlier than some expected, due in part to the impact of the COVID-19 pandemic. The virus responsible for COVID-19 had first been identified in Wuhan, China, in December 2019. It spread rapidly, helped by the Chinese New Year migration and Wuhan being a transport hub and major rail interchange. With cases growing exponentially, major disruption began to emerge in early 2020 and many Chinese cities went into lockdown. As the virus spread to other countries, the World Health Organization declared a Public Health Emergency of International Concern in January and a pandemic in March 2020. Initially, COVID-19 had been a major crisis for China. However, its authoritarian government imposed strict measures throughout the country, allowing it to bring the outbreak under control by the second quarter of 2020. While many berated the Chinese government for its delayed response and censorship of related information during the opening stages of the outbreak, few would doubt the success of China in essentially defeating the virus. By contrast, most other countries around the world had lengthy and catastrophic experiences with COVID-19, which wreaked severe economic damage. The U.S. recorded its millionth case by April 2020, and this had mushroomed to over 23 million by January 2021, with nearly 400,000 deaths, the most of any country. Although vaccines began to emerge, partisan divides regarding the outbreak and a laxer approach to preventing infections meant that the crisis would persist for many months to come. The U.S. and other major economies had negative growth for the year, while China achieved positive GDP growth of 2% in 2020. China's 14th Five Year Plan (2021-25) involved a "dual circulation" strategy, with plans for both external and domestic demand working together. 
This had been implemented partly in response to trade disputes with the U.S. and elsewhere, and partly due to the expectation that external demand would be depressed by the pandemic and that China needed internal demand to sustain growth. President Xi stated his aim as being to "fully bring out the advantage of its (China's) super-large market scale and the potential of domestic demand to establish a new development pattern featuring domestic and international circulations that complement each other." With skillful management of the pandemic and damage to long-term growth in the West, the relative economic performance of China exceeded many analysts' earlier forecasts. China saw average economic growth of more than 5.5% a year during this time. By the end of its 14th Plan period in 2025, it had comfortably met the threshold for a high-income economy, defined by the World Bank as a gross national income per capita of US$12,536 or more. Although its growth has slowed to 4.5% a year during the second half of the 2020s, China's GDP overtakes that of the U.S. by 2028. It continues to widen the gap between itself and the U.S. during the early 2030s, though at a slower rate of less than 4% a year. Longer-term demographic and other problems in China enable the U.S. to regain its position as the world's largest economy by the end of the century. Completion of the Lunar Gateway. The Lunar Gateway (LG) is the successor to the aging International Space Station (ISS). Whereas the ISS was placed in orbit around the Earth, the LG is close to the Moon. The partners involved in its construction are similar to the ISS: the Canadian Space Agency (CSA), European Space Agency (ESA), Japan Aerospace Exploration Agency (JAXA), and NASA. The Gateway is developed, utilized, and maintained in collaboration with commercial and international partners as a "staging ground" for lunar surface operations (both robotic and crewed) and for eventual travel to Mars. 
By sending people and cargo to and from cislunar space, those involved in the project gain the knowledge and experience necessary to venture to the Moon and beyond. Originally, NASA had intended to build the Gateway as part of an "Asteroid Redirect Mission", but later cancelled this plan. An informal joint statement on cooperation between NASA and Roscosmos was announced in September 2017. However, in October 2020, Roscosmos stated that the program would be too "U.S.-centric" for Russia to participate in, and in January 2021 confirmed that it would not take part in the LG. NASA commissioned studies by private companies into affordable ways to develop the station's power and propulsion elements – these private companies were Boeing, Lockheed Martin, Orbital ATK, Sierra Nevada and Space Systems/Loral. The LG would be built in stages. Originally, the plan had involved each part being delivered solely by the Space Launch System (SLS), a huge new rocket being developed by NASA. However, a subsequent plan incorporated commercial launch vehicles, such as the SpaceX Falcon Heavy. The latter would also carry supplies to the LG using a new craft called the Dragon XL – similar to the commercial cargo program for the International Space Station – to support crewed missions there and to the lunar surface. The international partners involved in the LG planned for its construction to occur gradually from 2024 to 2028, with delivery of four main components: Power and Propulsion Element (PPE) – serving as the main command and communications center of the LG, the PPE can generate 50 kW of solar electric power for its ion thrusters, which can be supplemented by chemical propulsion. An S-band communication system provides a radio link with nearby vehicles, and it can also function as a space tug, transferring a spacecraft from one orbit to another. 
Alongside the Habitation and Logistics Outpost (see below), the PPE is the first component of the LG to be delivered, launched by SpaceX on a Falcon Heavy in 2024. Habitation and Logistics Outpost (HALO) – a crew cabin for astronauts visiting the LG. Its primary purpose is to provide basic life support needs for the visiting astronauts and to enable preparation for trips down to the lunar surface. In addition to environmental controls and energy storage, it features space for science and stowage, data handling capabilities, and docking ports for visiting vehicles and future modules. The HALO is delivered in 2024. International Habitation Module (I-HAB) – an additional habitation module, built by ESA in collaboration with Japan. Together, the I-HAB and HALO provide a combined 125 m³ (4,400 cu ft) of habitable volume for the station. Delivery of the I-HAB occurs in 2026. This component also includes a large robotic arm contributed by Canada's space agency. European System Providing Refueling, Infrastructure and Telecommunications (ESPRIT) – a service module with an airlock for science packages, additional capacity for xenon and hydrazine propellant, and communications equipment. It also features docking ports and a small, windowed habitation corridor. The ESPRIT module consists of two parts and is fully installed by 2027. The complete assembly of the Lunar Gateway requires about 25 launches from Earth, most of which are uncrewed. The final delivery occurs in 2028 and the station then begins full operations, serving as a platform for missions down to the Moon. These surface operations are initially robotic but enable construction of a permanent human base in the 2030s. The LG has an expected service time of about 15 years, allowing it to operate into the early 2040s. Towards the end of this period, NASA begins testing a new and separate vehicle designed for crewed missions to more remote destinations, such as Mars. 
Known as the Deep Space Transport (DST), this can carry up to six astronauts on extended voyages, using both electric and chemical propulsion. The DST is returned to the LG after each mission to be serviced and reused. Overall, the Lunar Gateway lives up to its name as a "gateway" to places beyond low-Earth orbit (LEO) and is arguably the logical next step for human exploration of space. In addition to being a platform for regular visits to the lunar surface and functioning as a relay between the Earth and Moon, it also provides opportunities to test new technology for reaching Mars. Launch of the European ATHENA X-ray observatory. The Advanced Telescope for High Energy Astrophysics (ATHENA) is a major new X-ray telescope launched by the European Space Agency. This L-class (Large) project is the second of three missions in the "Cosmic Vision" program, which includes two other spacecraft – the Jupiter Icy Moons Explorer (JUICE) launched in 2022 and a gravitational wave observatory being deployed in 2034. X-ray observations are crucial for understanding the structure and evolution of stars, galaxies, and the Universe as a whole. These images can reveal "hot spots" in the Universe – regions where particles have been energized or raised to very high temperatures by strong magnetic fields, violent explosions, and intense gravitational forces. X-ray sources are also associated with the different phases of stellar evolution, such as supernova remnants, neutron stars and black holes. ATHENA is designed to answer a number of important questions in astrophysics: What happens close to a black hole? How did supermassive black holes grow? How do large-scale structures (i.e., galaxy clusters and superclusters) form? What is the connection between these processes? 
To address these questions, it can trace orbits close to the event horizon of black holes, measure black hole spin for several hundred active galactic nuclei (AGN), use spectroscopy to characterize the outflows and environments of AGN at their peak activity, look for supermassive black holes out to redshift z = 10, map the bulk motions and turbulence in galaxy clusters, find missing baryons in the cosmic web using background quasars, and observe the process of cosmic feedback where black holes inject energy on galactic and intergalactic scales. This enables astronomers to understand better the history and evolution of matter and energy – visible and dark – as well as their interplay during the formation of the largest structures in the Universe. Closer to home, observations constrain the equation of state in neutron stars, black hole spin demographics, when and how elements were created and dispersed into the intergalactic medium, and much more. To achieve these goals, ATHENA requires a collecting area of 3 square meters with 5 arcsec angular resolution and 12-meter focal length, for unmatched sensitivities. Relative to previous X-ray missions, it offers a 100-fold increase in the area for high resolution spectroscopy, deep spectral and microsecond spectroscopic timing with high count rate capability. It also features a large shield that blocks light from the Sun, Earth, and Moon, which otherwise would heat up the telescope and interfere with observations. The telescope remains operational until the late 2030s. China builds the world's largest particle accelerator. Following the success of the Large Hadron Collider (LHC) in Europe, the Chinese decided to build their own larger particle accelerator. Researchers at the Institute of High Energy Physics in Beijing announced plans for a machine 52 km (32.5 mi) in length – twice the circumference of the LHC. 
This would allow the Higgs boson to be studied in greater detail, revealing new insights into the fundamental structure of matter, and confirming whether multiple types of Higgs boson existed. Construction began in 2019, with completion in 2028. It paves the way for an even larger project in 2035. Printed electronics are ubiquitous. The printed electronics market has seen exponential growth. By now, it has ballooned to over $300 bn globally. This technology began with a small number of niche, high-end products. It expanded rapidly in the 2010s, thanks to plummeting costs and improved production methods. By the 2020s it had exploded into the mainstream – creating a new generation of ultra-thin electronics. Today, these have such low fabrication costs that they are ubiquitous in countless everyday business and consumer applications. Many previously bulky or heavy devices can now be folded, stored, or carried as easily as sheets of paper. This includes flexible TV displays that can be rolled or hung like posters. Also widespread are electronic newspapers with moving pictures, "smart" packaging, and labels with animated text, along with signage in retail outlets that can be updated shop-wide at the touch of a button. Multimedia players with expandable, fold-out touchscreens are especially popular. Even low-end models are now the size and weight of credit cards and can easily fit inside a wallet. With petabytes of storage, gigapixels of screen resolution and superfast transfer speeds, they are orders of magnitude more powerful than iPods of the previous decade. They are also completely wireless – no cables or physical connections of any kind are required, with music being enjoyed using wireless earphones. The UK population reaches 70 million. Britain will soon become the most populous country in Europe, overtaking both Germany and France. This is mainly due to large numbers of immigrants. 
Combined with a shrinking labor force, this is putting a major strain on public services – especially in London, which has borne the brunt of the increase. British newspapers are going out of circulation. By the late 2020s, the last of Britain's national newspapers are being taken out of circulation. Even formerly major titles like the Sun, the Daily Mail and the Daily Mirror have ceased production. The surviving newspapers have now all transitioned to entirely digital formats. The printing industry had a long history in Britain. The first printing press in England was established by William Caxton in 1476. This led to further developments in mechanical movable type and a huge increase in printing activity over subsequent centuries. During the 1600s, various publications would spread both news and rumors – such as pamphlets, posters, and ballads. The English Civil War (1642–1651) greatly increased the demand for news. Among the first real "newspapers" were the Oxford Gazette (1665), Berrow's Worcester Journal (1690) and the Daily Courant (1702). By the 1720s, there were 12 London newspapers and 24 provincial papers. The first English journalist to achieve national importance was Daniel Defoe (1660–1731). During the 18th and 19th centuries, the Industrial Revolution allowed production methods to be improved, print runs to be greatly increased and newspapers to be sold at lower cost. Circulation of The Times rose from 5,000 copies in 1815 to 10,000 in 1834 and 40,000 by 1851: about 80% of the entire market. The period from 1860 to 1910 was considered a "golden age" of newspaper publication, with further technical advances in printing and communication – combined with a more professional style of journalism and the prominence of new owners. Socialist, labor and trade union papers began to proliferate. 
In 1896, The Daily Mail was first published and became the first daily newspaper aimed at the newly literate "lower-middle class market resulting from mass education, combining a low retail price with plenty of competitions, prizes and promotional gimmicks." It was the first British paper to sell a million copies a day. Two other "halfpenny" papers to emerge were the Daily Express and the Daily Mirror. By the 1930s, over two-thirds of the population was estimated to read a newspaper every day, with almost everyone taking one on Sundays. Circulations continued to increase, reaching a peak in the mid-20th century. From the 1960s onwards, however, sales began to decline. In an effort to attract more readers, some tabloids – including The Sun, the Daily Mirror, and the Daily Star – began publishing images of topless women. The 1980s saw the introduction of computer-based typesetting and full-color offset printing. The reporting of stories became ever more sensationalized and controversial as the fall in sales continued through the 1990s and into the 21st century. The rapid rise of the Internet – providing instant and free access to information – accelerated the decline of the newspaper industry. A major factor was the emergence of smartphones, tablets and other handheld, web-enabled devices, which became cheap and widely available. By 2015, none of the remaining UK papers had a daily circulation above two million. The overall circulation of newspapers declined by 6.6% in 2014–15, with further declines in the following decade, resulting in the end of printed national newspapers in Britain. Launch of the Comet Interceptor. Comet Interceptor is a mission by the European Space Agency (ESA) to rendezvous with a comet originating from the outer Solar System that has now begun to approach the Sun. This follows a similar effort – Rosetta – which visited 67P/Churyumov-Gerasimenko in 2014. 
However, unlike that earlier probe, this new craft is designed to encounter a "pristine" comet with largely undisturbed material surviving from the dawn of the Solar System. The targeted body is therefore an 'Oumuamua-like interstellar object, or a possible fragment from the Oort cloud, approaching the inner Solar System for the first time, as opposed to a short-period comet like 67P that orbits the Sun every six years. The mission is unusual, in that it launches before a primary target has even been found. For a dynamically-new comet (DNC) or interstellar object, the time between discovery, perihelion, and departure from the inner Solar System – typically a few months to a year – is too short for mission organizers to prepare and launch a new spacecraft. As such, these astronomical objects can only be encountered after being discovered inbound, with enough warning to direct an already operating spacecraft to approach. However, new observatories, such as the recently completed Large Synoptic Survey Telescope, are now improving this time by covering large areas of sky more deeply and rapidly. The spacecraft is delivered to Lagrange point L2 via its own propulsion system. Comet Interceptor consists of a mothership and two smaller "daughters", which perform simultaneous observations from multiple angles to generate a 3D profile. The daughter probes carry instruments on different trajectories through the comet's tail, while also getting close to the nucleus. Together, all three spacecraft greatly improve the understanding of pristine comets and their surrounding environment, revealing their structures and compositions in more detail and their dynamic nature as they interact with the solar wind. This, in turn, provides new insights into the conditions that existed at the birth of the Solar System and perhaps even further back in time. Completion of the Brenner Base Tunnel. 
The Brenner Base Tunnel is a 64 km (40 mi) freight and passenger train route, running through the base of the Eastern Alps mountain range and linking Scandinavia with the Mediterranean. It consists of two primary tunnels, carrying traffic from north to south (and vice versa). A smaller secondary tunnel lies below these and is used during construction as a guide tunnel to determine geological conditions, then later used for drainage and emergency access. The €9.2bn ($11bn) project receives substantial funding from the European Union (EU). It forms the central component of the gigantic SCAN-MED Corridor – beginning in Finland and running all the way down to Malta – which serves as a crucial north-south axis for the European economy. In addition to the Brenner Base Tunnel in central Europe, another major component of the SCAN-MED Corridor is the Fehmarnbelt Fixed Link in the northern part of the route. This undersea tunnel provides a direct link between northern Germany and Lolland, and from there to the Danish island of Zealand and Copenhagen, becoming the world's longest combined road and rail tunnel. The Fehmarnbelt Fixed Link is completed shortly after the Brenner Base Tunnel. Maximum operating speeds in the Brenner Base Tunnel are 250 km/h (155 mph) for passenger trains and 160 km/h (99 mph) for freight trains. This helps to relieve a major bottleneck on the alpine connections between Germany and Italy. Previously, journeys had involved meandering rail and congested roads with steep inclines. By contrast, the Brenner Base Tunnel is an entirely new route passing smoothly and directly through tens of kilometers of rock. At a depth of 1,720 m (5,640 ft) below the surface, it becomes the lowest passage through the Alps – cutting travel times from 80 to just 25 minutes. A major technical challenge had been the four distinct rock types and one of Europe's longest fault lines, through which the tunnel would need to pass. 
Phase I lasted from 1999 until 2002 and involved preparatory work. Phase II followed from 2003 to 2008 with prospection borings, environmental planning, and the final technical details. The construction phase of the main tunnel began in 2011. The project involves a total of 21.5 million cubic meters of material being excavated, with around one-third of this being recycled and reused as concrete aggregate in the tunnel. The remaining material goes to five sites above ground, acting as filler for areas that can be reforested and landscaped. When completed in 2028, the Brenner Base Tunnel becomes the longest underground railway connection in the world, surpassing the 57 km (35 mi) Gotthard Base Tunnel. Delhi becomes the most populous city in the world. By 2028, Delhi has overtaken Tokyo to become the most populous city in the world. The Japanese capital had held the title since 1955, but during the early years of the 21st century it began to reach a plateau. After peaking at 37 million, the city actually went into decline from 2020 onwards. The Indian capital, by contrast, was surging ahead. Home to some 29 million people in 2018, Delhi expanded to reach 37.2 million just a decade later. India as a whole has recently surpassed China to become the most populous country on the planet. There are many challenges associated with rapidly growing urban areas, especially in low-income and lower-middle-income countries. These include the provision of adequate housing, transportation, water, waste management and sanitation, energy, and other vital infrastructure, as well as employment and basic services such as education and health care. However, India's workforce is young and dynamic; its economy is expanding fast and on course to rival the other major superpowers by 2040. One surprising area in which Delhi will soon benefit is the environment. For many years, the World Health Organization had ranked Delhi as the most polluted city on Earth. 
In the 2010s, poor air quality was causing 2.3 million deaths in India each year – almost the same as from tobacco use – costing 3% of the country's Gross Domestic Product (GDP). However, towards the end of the 2020s, traditional petrol and diesel cars are being phased out, in favor of electric vehicles. Widespread use of solar and other renewables is also making a difference now, with almost 60% of India's electricity being generated from non-fossil fuels. Furthermore, as part of its commitment to the Paris climate agreement, India had pledged $6.2 billion to reforest 235 million acres (95 million hectares) of the country by 2030. This vast project will soon increase India's forest cover from 21% of total land area to 33%. Los Angeles hosts the Summer Olympic Games. From 21st July to 6th August 2028, the 34th Summer Olympics are held in Los Angeles, California. This event is the fifth Summer Games to be hosted in the United States, and the third in Los Angeles – following St. Louis 1904, Los Angeles 1932, Los Angeles 1984, and Atlanta 1996. Los Angeles also becomes the third city after London (1908, 1948 and 2012) and Paris (1900, 1924 and 2024) to have hosted the Olympic Games on three occasions. The 2028 Games are spread across four areas, each highlighting the different geographical features of the city: Long Beach, South Bay, Downtown and Valley Sports Park. Travel between venues is made easy thanks to L.A.'s extensive system of highways and public transport, with many improvements and upgrades having been made since the 1984 Games. By 2028, $88bn worth of expanded subway, light rail, rapid bus transit, and express lane projects are operational, connecting all sports parks, the airport, the Games Centre, and every corner of L.A. The Olympic and Paralympic Village is based at the centrally located UCLA campus, close to the city's cultural and entertainment attractions. All Olympic and Paralympic sports parks are within 40 minutes of the Village. 
The opening and closing ceremonies are each, for the first time, staged across two different stadiums. The opening ceremony starts at the Los Angeles Memorial Coliseum and finishes at the Los Angeles Stadium at Hollywood Park (the latter forming part of a major new sports, entertainment, hotel, and business district). The order is reversed for the closing ceremony. While most Olympic host cities have seven years to prepare, Los Angeles was given an additional four, for a total of 11 years. This was due to the unusual bidding process in 2017, which saw Paris and Los Angeles elected simultaneously for 2024 and 2028, respectively. Total solar eclipse in Australia and New Zealand. A total solar eclipse occurs on 22nd July 2028. Totality occurs in a narrow path across the Earth's surface, with a partial solar eclipse visible over a surrounding region thousands of kilometers wide. The central line of the path crosses the Australian continent from the Kimberley region in the northwest and continues in a southeasterly direction through Western Australia, the Northern Territory, southwest Queensland, and New South Wales, close to the towns of Wyndham, Kununurra, Tennant Creek, Birdsville, Bourke, and Dubbo. It continues on through the center of Sydney, where the eclipse has a duration of over three minutes. It also crosses Dunedin, New Zealand. The eclipse also marks the start of the final battle of the entities, the last time the entities fight for balance. The battle is so powerful that it causes the timeline to split: if Scarlet wins, the timeline ends there, but if Balance wins, the world is saved. After this, Jecten, the current Entity of Balance, uses his restore power to balance the world and remove its disasters. This also causes Jecten to lose his powers, because once balance is restored, the world no longer needs an Entity of Balance.

2029: Human-like AI is becoming a reality. By the end of this decade, a milestone is reached in artificial intelligence, with computers now easily passing the Turing Test. This test is conducted by a human judge who is made to engage in a natural language conversation with one human and one machine, each of which tries to appear human. Participants are placed in isolated locations. For several decades, information technology had seen exponential growth – leading to vast improvements in computer processing power, memory, bandwidth, voice recognition, image recognition, deep learning, and other software algorithms. By the end of the 2020s, it has reached the stage where an independent judge is never able to tell which is the real human and which is not. Increased automation of banking. During the early 21st century, a wave of new technology swept the banking sector, as companies worked to improve costs and efficiencies. Online services, combined with mobile apps, automated chatbots, virtual telephone assistants, increasingly sophisticated ATMs, and other features all made it quicker and easier than ever to manage one's personal finances. On the business and corporate side, the rise of cloud computing, big data and intelligent algorithms yielded significant time and cost savings in research, marketing, processing, and other areas, while giving more power and insight to decision makers. By 2020, U.S. banks alone were investing more than $150 billion in new technology annually, spending more than any other industry. While customers and clients benefited greatly from these improvements, automation reduced the need for human staff. By 2029, more than 10% of banking jobs have been eliminated compared to a decade earlier, adding up to some 200,000 roles in the U.S. Retail branches and call centres are the hardest hit, where over a quarter of staff are no longer needed. 
Commercial and investment bank employees are less vulnerable for now, but even these jobs become threatened in subsequent decades as decision making is delegated to ever more powerful hardware and algorithms. Close approach of the near-Earth asteroid Apophis. 99942 Apophis is an asteroid with a diameter of 370m (1,214 ft) and an exceptionally close orbit to Earth. It caused a sensation in 2004 when initial observations indicated a 2.7% chance that it would hit Earth in 2029. This gave it the highest rating ever recorded on the Torino impact hazard scale, reaching level 4. Depending on its composition, location and angle of entry, the asteroid would make atmospheric entry with up to 1,200 megatons of kinetic energy – approximately 21 times more than the Tsar Bomba, the largest nuclear weapon ever tested – and six times more than the 1883 eruption of Krakatoa. However, the orbital characteristics of Apophis were later refined, eliminating any chance of an impact. Nevertheless, the object remains on an alarmingly close path that skims within 31,000 km of Earth, which is 10 times closer than the Moon, and even closer than some man-made satellites. Apophis is triple the size of 2019 OK, another large object that passed by in July 2019 and comes even closer than that earlier asteroid. Its close passing sparks further debate about the need to defend Earth from potential impacts. Mass application of gene drives on mosquitoes. During the 2010s, it was estimated that nearly 700 million people were catching mosquito-borne illnesses, resulting in over a million deaths worldwide each year. In spite of the insects' tiny size, they were the deadliest creatures in the world, killing more humans than any other animal. Among the diseases being transmitted were malaria, dengue fever, West Nile virus, yellow fever, chikungunya, filariasis, the Zika virus and many others. 
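The energy comparisons above can be sanity-checked with simple division. The figures below assume the commonly cited approximate yields of ~57 megatons for the Tsar Bomba and ~200 megatons for the 1883 Krakatoa eruption (neither value is stated in the text itself):

```python
# Rough check of the impact-energy comparisons for Apophis.
# Assumed reference yields (approximate, widely cited figures):
#   Tsar Bomba test  ~ 57 megatons of TNT equivalent
#   Krakatoa (1883)  ~ 200 megatons of TNT equivalent
apophis_energy_mt = 1200   # upper-bound atmospheric-entry energy, megatons
tsar_bomba_mt = 57
krakatoa_mt = 200

vs_tsar = apophis_energy_mt / tsar_bomba_mt
vs_krakatoa = apophis_energy_mt / krakatoa_mt

print(f"~{vs_tsar:.0f}x Tsar Bomba")    # ~21x Tsar Bomba
print(f"~{vs_krakatoa:.0f}x Krakatoa")  # ~6x Krakatoa
```

Both ratios land on the multiples quoted in the text, suggesting the 1,200-megaton figure was derived from these reference yields.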
Affecting mostly tropical and sub-tropical regions, the majority of deaths were of young children in sub-Saharan Africa. However, in 2015 it was reported that – due to warming temperatures from climate change – mosquitoes had begun spreading historically rare diseases into Europe: malaria to Greece, the West Nile virus to parts of Eastern Europe and chikungunya to Italy and France. Malaria and West Nile virus were becoming more common in North America too. The economic and social costs of mosquito-borne infections were often considerable. An individual could be forced to miss time at work or their place of education, pay for doctors' visits and related travel, obtain funds to cover prescription or over-the-counter medicines, pay for hospital bills or medical treatment, and buy insect sprays or repellents to prevent further bites. There were also funeral expenses in the case of deaths. At a national level, the costs to governments included the maintenance and staffing of health facilities, the purchase of drugs and vaccines, public education programs to warn about epidemics, research to prevent further outbreaks and to improve treatments, lost productivity and overall damage to businesses, negative effects on tourism and sports events, as well as compensation for affected communities. Of the many and various mosquito-borne illnesses, malaria presented the greatest threat, with 3.2 billion people (106 countries and territories) located in areas at risk of transmission. It was a particularly acute problem in Africa, where an estimated 91% of malaria deaths occurred. This disease alone was causing $12 billion in economic damage to the continent each year, reducing the annual Gross Domestic Product (GDP) of some countries by as much as 1.3%. In those nations where the disease was most common, it represented up to 40% of public health spending and could account for 60% of visits to health clinics, along with 50% of hospital admissions. 
A variety of methods were in use to control mosquitoes – including the elimination of breeding places (such as pools, ditches, old tires, buckets, and other containers of stagnant water); physical barriers like window screens and nets; sprays and repellents; attractant-laced traps; and biological control via fungi and nematodes or predators such as fish. Another strategy was the use of pesticides, but these were gradually becoming less effective as the insects developed resistance, and the chemicals were damaging to the environment. While research into vaccines and new medications was now showing promise, a more comprehensive and long-term proposal was to eradicate mosquitoes – or at least drastically reduce their ability to spread infections. It was hoped that such an initiative – known as a "gene drive" – would either A) cause the extinction of several of the major disease vectors, including Aedes albopictus (a carrier of dengue and Zika) and the malaria-carrying Anopheles, or B) induce a massive genetic change throughout the mosquito population, making the insects unable to transmit the pathogens. The ability to modify animal and plant DNA had been around for decades already. The first genetically modified animal was a mouse created in 1973. The first genetically modified crop, an antibiotic-resistant tobacco plant, was produced in 1982. These organisms were still subject to the laws of genetic inheritance, however, as first described by Gregor Mendel in 1865. In other words, if a mosquito carrying altered DNA were to mate with another mosquito, there would only be a 50-50 chance of the offspring inheriting the modified gene. The probability would be even lower that a subsequent generation would pass on the modified gene, and so on. Gene drives, by contrast, were designed to change the fundamental laws of inheritance.
A powerful new editing technique known as CRISPR/Cas9 was demonstrated on fruit flies in the 2010s, showing that it was possible to engineer organisms in which every generation received a modified gene – and an entire population would get it within a few generations. With potential for large-scale transformation of species, attention turned to the problem of mosquitoes in Africa. This process of artificially biasing inheritance of desired genes was described as a "mutagenic chain reaction" by researchers. While holding great promise, there were also a number of risks and ethical issues. By altering mosquitoes' DNA to ensure their offspring would always be male, for example, and releasing them into the wild to mate with natural mosquitoes, malaria and other diseases could be wiped out. However, there could also be side effects. Causing the insects' populations to crash might disrupt food chains or have other unforeseen consequences. Government reports highlighted these and other areas of concern, delaying the mass introduction of gene drives, but this was only a temporary setback as major investment was going into research and development of the process. Among the biggest supporters in the fight against malaria was Bill Gates whose foundation had committed almost $2 billion in grants between 2000 and 2015 to combat the disease. To safeguard against the dangers, a "reversal drive" was developed alongside the gene drive that could, if necessary, undo the process and spread the original genes back into the population. In addition, other alternatives to creating all-male offspring were considered – such as introducing genes to make the insects target animals rather than humans, or genes that stopped the parasite itself from multiplying inside the insect. Laboratory tests were conducted before the introduction of these GM mosquitoes into the wild. 
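The difference between ordinary Mendelian inheritance and a gene drive can be illustrated with a minimal allele-frequency model. This is an illustrative sketch, not the researchers' actual model: it assumes random mating and Hardy-Weinberg genotype proportions, with a single parameter `d` for the probability that a heterozygote transmits the edited allele (0.5 under normal Mendelian inheritance, ~1.0 under a CRISPR-based drive):

```python
# Minimal deterministic model of gene-drive spread (illustrative only).
# p = frequency of the edited allele in the population
# d = probability a heterozygote transmits the edited allele
#     (0.5 = Mendelian inheritance, 1.0 = perfect gene drive)

def next_gen(p: float, d: float) -> float:
    """Allele frequency after one generation of random mating,
    assuming Hardy-Weinberg genotype proportions."""
    # Homozygote carriers (p^2) always transmit; heterozygotes
    # (2p(1-p)) transmit with probability d.
    return p * p + 2 * p * (1 - p) * d

def spread(p0: float, d: float, generations: int) -> list[float]:
    freqs = [p0]
    for _ in range(generations):
        freqs.append(next_gen(freqs[-1], d))
    return freqs

# Release edited mosquitoes at 5% of the population:
mendelian = spread(0.05, d=0.5, generations=10)
drive = spread(0.05, d=1.0, generations=10)

print(f"Mendelian after 10 generations: {mendelian[-1]:.3f}")  # stays at 0.050
print(f"Gene drive after 10 generations: {drive[-1]:.3f}")
```

Under Mendelian inheritance the edited allele never gains ground, while with `d = 1.0` it approaches fixation within roughly ten generations – the "mutagenic chain reaction" described above.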
Detailed studies then followed to determine the optimal numbers and best locations to release the insects, as well as the various cause-and-effect pathways in the environment. Following small-scale experiments in the early 2020s, the mass introduction of gene drives is becoming a reality towards the end of the decade. After years of mutagenic chain reactions, follow-up assessments and computer models, researchers have pieced together a trove of evidence and data, refining the process to be safe, with acceptable impacts on the environment. These and other issues have now been largely resolved, with most nations in Africa and elsewhere having approved the use of gene drives. A sharp drop in malaria is being witnessed by 2029. Other pests and invasive species are being targeted using this method – cane toads in Australia, for example; locusts that swarm and destroy crops; and rodents that carry diseases. Unfortunately, given their enormous power and potential, gene drives are also now coming under intense scrutiny for an altogether different and more sinister reason: their ability to be used as bioweapons on human populations. This is leading to increased transparency, openness, and cooperation between governments around the world to ensure better monitoring of scientific research activities and to ensure that safeguards and countermeasures are built into gene drives. Global reserves of silver are running out. Silver is a precious metal which has been used by humans for thousands of years in currency, jewellery, and sculpture. In modern times it has a range of applications including dental fillings, mirrors, nuclear reactors, photographic film, solar reflectors, and solder, along with alloys to make silverware, ornaments, and the like. Due to its high conductivity (the highest of any metal) silver is very useful in electronic devices such as radios, antennas, computer keyboards and audio equipment, as well as wires and cables.
As most silver ions are toxic to microbes, silver is also used in bandages and medical coatings. Silver has long been a high-value commodity, often used for investment and even as the base for entire economies. Mines were once found throughout the world, though the bulk was extracted in China and Poland. Between 1900 and the early 2000s, the average grade of mined silver ore declined by over 80%. New mine deposit discoveries peaked in the mid-1980s, with average annual discoveries declining over 60% between then and the early 2010s. By the late 2020s, global reserves have reached critically low levels, causing prices to rise considerably and driving demand for alternative materials. In some applications, silver can be replaced by aluminium, copper, and rhodium, but the lower conductivity necessitates new refinement techniques. Recycling is proving effective enough to keep the supply somewhat stable for a few more decades, though demand continues to outpace availability. Finland bans coal for energy use. In 2029, Finland implements a nationwide ban on coal use. The country's greenhouse gas emissions had already peaked in the early 2000s, but it was still reliant on coal for its energy, with 11 million tons burned each year. Lacking any domestic production of its own, this was imported from other nations – primarily Russia (72%), the USA (7%), Canada (6%), Australia (6%) and Poland (3%). The percentage of coal as an energy source in Finland declined from 18.6% in 1980 to 14.7% in 1990 and 11% in 2000, rising again slightly before entering a permanent decline from 2003 onwards and reaching 8% by 2018. The ban was announced in February 2019 and comes into effect from 1st May 2029. Madagascar's radiated tortoise is extinct in the wild. Years of unmitigated hunting and loss of habitat, as well as capture for the illegal exotic pet trade, have caused the wild population of the radiated tortoise to dwindle to almost nothing.
Like over 80% of the island's flora and fauna, the radiated tortoise can be found nowhere else on Earth naturally. The government of Madagascar attempted to halt the decline by introducing a series of protection laws. Unfortunately, the size of the island's wildlife areas and poor economic conditions meant these restrictions were all but ignored. Even protected areas were invaded by poachers. Surveys in the 2010s revealed a shocking decline in the number of tortoises. The breeding population quickly shrank throughout the 2020s. Though there is hope of future repopulation using those bred in captivity, the shrinking habitats they once occupied make this prospect unlikely. In another 15 years or so, almost all of Madagascar's forests will be gone. Phase 1 of the California High-Speed Rail line is complete. The California High-Speed Rail line is a major transportation project to modernise the rail routes on the west coast of the United States, making them fit for the 21st century. It connects Los Angeles to San Francisco, with trains running at speeds of up to 200 miles per hour (320 km/h), enabling journeys between the two cities in under three hours. Plans for a high-speed rail system linking Northern and Southern California were proposed by Governor Jerry Brown in the 1980s, but it took until 2008 for funding to be approved by voters. Even then, criticism remained over the budget, feasibility, legal, environmental, and other issues. Covering 800 miles and 24 stations, the system would cost over $68 billion in total. A ground-breaking ceremony for the project was held in January 2015. Construction would proceed in various segments. The initial 130-mile (209 km) stretch from Fresno to Bakersfield in the Central Valley would open in 2022, followed by several other sections, leading to Phase 1 completion in 2029.
This would be followed by Phase 2, consisting of two extensions: a 110-mile (177 km) route from Sacramento to Merced and a 167-mile (269 km) route from Los Angeles to San Diego. Over a 58-year period (from the start of operations in 2022 through 2080), the system reduces auto travel on the state's highways and roads by over 400 billion miles – significantly reducing air pollution and carbon emissions in the region. Other benefits include the creation of 450,000 permanent jobs, supported by the new commuters using the system. Alongside the California High-Speed Rail line is a private venture known as XpressWest (formerly DesertXpress) that aims to build a high-speed route from Victorville to Las Vegas. Other rail projects are now taking shape across the USA. On the other side of the country, the Washington to Newark section of the Northeast Corridor route is nearing completion. Expansion of London's Heathrow Airport. For many years, Heathrow Airport in London held the title of world's busiest airport when measured by international passenger traffic, as well as the busiest in Europe by overall passenger traffic. By the early 2010s, it was handling 70 million passenger journeys per year (though Dubai International would eventually overtake it) and forecast to reach 85 million by 2030. As the airport began to reach 100% capacity at peak times, concerns were raised about overcrowding and the need for expansion. Globally, air traffic in general was growing rapidly – due in large part to booming demand from the emerging markets of the Asia Pacific and Middle East regions – and the number of planes in service was projected to double between 2015 and 2035. In 2009, the UK's Labour government announced that it would support the expansion of Heathrow and encouraged the airport operator (BAA) to apply for planning permission. However, these plans were cancelled the following year by the new coalition government of the Conservatives and Liberal Democrats.
Following several more years of planning and reviews, the Airports Commission recommended a third runway and sixth terminal. A full report was finally published in 2015 that confirmed a new expansion to the northwest as the chosen proposal. After further revisions, and with estimated costs of £14.3bn (funded privately), this northwest runway and terminal plan was approved by the government in 2018. The new infrastructure would increase Heathrow's capacity from 85 million to 130 million passengers annually. It was predicted to create 180,000 new jobs and generate £200bn in total economic benefits by 2050. Up to 40 new long-haul routes would become available. In addition to the airport itself, a number of access improvements were planned – such as the 118 km (73 mile) Crossrail line running all the way from east to west London, along with a refurbished Piccadilly line providing more spacious, air-cooled trains at higher frequencies than before. The pro-expansion stance of both Labour and the Conservatives was supported by many groups and prominent individuals, including the aviation sector, Confederation of British Industry (CBI), British Chambers of Commerce (BCC) and 32 local chambers of commerce, local authorities, the manufacturing and freight sector and trade unions. However, there was considerable opposition too. Political opponents included the Liberal Democrats, Green Party and UKIP. Sadiq Khan, Mayor of London, and his two predecessors, Boris Johnson and Ken Livingstone, were also against it. Various campaign groups voiced concerns, including Greenpeace, Friends of the Earth, the RSPB, and WWF, as well as conservation organisation The National Trust, and developmental charities Oxfam and Christian Aid, alongside Hillingdon Council – a local government in West London. Among the biggest concerns were the impacts on local communities.
Some 700 homes, a church and eight Grade II-listed buildings would have to be demolished or abandoned (Grade II being a designation given to "buildings [that] are nationally important and of special interest", warranting every effort to preserve them). The main high street in the village of Harmondsworth would need splitting and a graveyard would have to be "bulldozed". The entire village of Sipson was in danger of being wiped from the map. The other major concerns were air and noise pollution, along with greenhouse gas emissions. Environmental campaigners warned that the increased CO2 caused by the additional flights would make it harder for the UK to meet its commitments to the Paris climate agreement. They also pointed to studies showing that the claimed economic benefits would be negated by the long-term costs of carbon. Building a third runway would also raise the general level of air pollution, at a time when 40,000 people were dying each year from dirty air across the country. The UK was already in breach of legal limits and the government had lost multiple court cases on the issue. Furthermore, expanding Heathrow and its flight paths would expose hundreds of thousands of additional residents in London – as well as the neighbouring county of Berkshire – to sustained high levels of aircraft noise for the first time. Many people also resented London gaining yet another major project, drawing money away from the rest of the UK. Other regions were falling behind and there were opportunities for investment in Birmingham and Manchester, for example. The widening gap between London and everywhere else had already created a worrying socio-economic divide. This ever-growing sense of inequality may have been a contributory factor in the Brexit referendum result. The arguments from the opposition side were strong, but those in support were stronger. 
The importance of Heathrow as an international transport hub and a catalyst for wealth generation in London and the Southeast was deemed too great, with approval granted by the government in June 2018. As with many large projects of this type, however, financial issues and construction delays pushed back the original timetable. Completion of a third runway had been scheduled for 2026 but was subsequently revised to 2029. Launch of the ARIEL spacecraft. The Atmospheric Remote-sensing Infrared Exoplanet Large-survey (ARIEL) is a space observatory launched in 2029 as the fourth medium-class mission of the European Space Agency's Cosmic Vision programme. It aims to observe 1,000+ known exoplanets using the transit method, studying and characterising the planets' chemical compositions and thermal structures, simultaneously in both visible and infrared wavelengths. ARIEL is launched to the L2 Lagrangian point, about 1,500,000 km (930,000 mi) from Earth, where it benefits from the blockage of sunlight to maximise the potential for detecting targets. It provides a wealth of scientific data on a diverse range of exoplanets: from extremely hot, to temperate; and from gaseous to rocky. Its photometer, spectrometer, and guidance system are sensitive enough to determine the presence of clouds in atmospheres, as well as some exotic molecules that may have been missed by earlier telescopes. In addition to revealing new insights into many previously studied worlds, the mission addresses fundamental questions about what exoplanets are made of and how planetary systems form and evolve. The European Space Agency formally approved the mission in November 2020, with a scheduled launch date of 2029. The mission duration is four years. Jupiter's Great Red Spot is disappearing. The first sightings of a Great Red Spot on Jupiter were made in the 1660s by Robert Hooke and Giovanni Cassini.
This feature – a monstrous, anticyclonic storm – was not studied in detail until the late 19th century. At that time, its diameter was found to be around 25,500 miles (41,000 km), large enough to swallow three entire planet Earths. On 25th February 1979, NASA's Voyager 1 spacecraft returned close-range photos of the planet. These revealed that its Great Red Spot had shrunk to 14,500 miles (23,335 km). Later studies by the Hubble Space Telescope, from the 1990s onwards, revealed it continuing to get smaller in size. By 2012, the rate of shrinkage was nearly 1,000 km per year and increasing. It was theorised that small eddies, observed feeding into the storm, were accelerating this change by altering the internal dynamics. The raging winds in this turbulent region were measured to be 384 mph (618 km/h), greatly surpassing even the strongest hurricanes on Earth. Another space probe – Juno – orbited the gas giant in 2016. This provided a greater understanding of the atmospheric composition, cloud motions, temperature, magnetosphere, gravity, and other properties affecting the overall dynamics of Jupiter. Once again, the Great Red Spot was seen shrinking and losing momentum. This process would continue into the following decade. By the end of the 2020s, it has vanished entirely. Yet another probe, Jupiter Icy Moon Explorer (JUICE), arrives in 2030, just after its disappearance. The wreck of the Titanic has decomposed. By the late 2020s, the famous wreck of the Titanic has been reduced to a mere rust patch. Metal-eating bacteria have dissolved what remains of the once mighty structure. Though some artifacts were salvageable, any hope of recovering the ship itself is now gone. The number of satellites in Earth orbit has grown exponentially. By the end of this decade, the number of active satellites in orbit around Earth has reached nearly 60,000 – a 20-fold increase compared to a decade earlier and a 75-fold increase since the start of the century.
Russia (then known as the Soviet Union) launched the first artificial satellite, Sputnik 1, in October 1957. Just three-and-a-half years later, more than 100 others had been placed in orbit as the Space Race got underway between the Soviets and the USA. The numbers continued to increase over the decades with more and more countries able to conduct launches for purposes such as weather monitoring, telecommunications, surveillance, and GPS. During the early 21st century, smaller and lighter satellites with more compact and miniaturised components became increasingly popular. These provided the advantage of cheaper launch costs and a more flexible range of uses. This segment of the market became divided into various classes depending on the weight of the spacecraft – from "mini" satellites (between 100 and 500 kg), descending through micro, nano, pico and femto satellites, the smallest being less than 0.1 kg. Concerns mounted over the accumulation of space debris from old satellites and other defunct or abandoned objects. In a worst-case scenario, this might trigger a catastrophic chain reaction of collisions – known as the Kessler syndrome – with potential to block safe access to space for generations. Despite the growing risk, satellite numbers began to explode from the late 2010s onwards. SpaceX announced plans for Starlink, a huge constellation to expand global Internet coverage, as well as satellites for military, scientific and other purposes. Many rival companies emerged with similar projects that collectively appeared set to transform the already crowded space above Earth, creating a kind of "satellite singularity". The 2,600 active satellites in 2020 would be joined by a further 57,000 over the next 10 years, bringing the total to almost 60,000. Astronomers had earlier reported interference with ground-based observations and the spoiling of the night sky, a problem now made even worse.
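The small-satellite classes mentioned above can be expressed as a simple lookup. The boundaries below follow the widely used (but not formally standardised) convention of decade steps below the 100–500 kg "mini" class, consistent with the text's note that the smallest craft weigh under 0.1 kg:

```python
# Sketch of common small-satellite mass classes (conventional boundaries,
# not a formal standard).
def satellite_class(mass_kg: float) -> str:
    if mass_kg >= 500:
        return "conventional"
    elif mass_kg >= 100:
        return "mini"
    elif mass_kg >= 10:
        return "micro"
    elif mass_kg >= 1:
        return "nano"
    elif mass_kg >= 0.1:
        return "pico"
    else:
        return "femto"

print(satellite_class(260))   # mini
print(satellite_class(4))     # nano
print(satellite_class(0.05))  # femto
```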
Satellite developers responded with new measures to reduce reflected sunlight through spacecraft orientation, sun shielding, and surface darkening. The end of this decade is marked by the emergence of a quantum Internet, providing absolutely secure encryption. This orbiting network, still at an early stage of deployment in 2029, is massively expanded during the 2030s.

2030: Global population is reaching crisis point. Rapid population growth and industrial expansion are having a major impact on food, water, and energy supplies. During the early 2000s, there were six billion people on Earth. By 2030, there are an additional two billion, most of them from poor countries. Humanity's footprint is such that it now requires the equivalent of two whole Earths to sustain itself in the long term. Farmland, fresh water, and natural resources are becoming scarcer by the day. The extra one-third of human beings on the planet means that energy requirements have soared, at a time when crude oil supplies are in terminal decline. A series of conflicts has been unfolding in the Middle East, Asia, and Africa, at times threatening to spill over into Europe. With America involved too, the world is teetering on the brink of a major global war. There is the added issue of climate change, with CO2 levels reaching almost 450 parts per million. As a result, natural feedbacks are kicking in on a global scale. This is most apparent in the Arctic, where melting permafrost is now venting almost one gigaton of carbon annually. There are signs that a tipping point has been reached, which is manifesting itself in the form of runaway environmental degradation. Nature's ecosystems are changing at a speed and scale rarely witnessed in Earth's history. This is also adding to food shortages: crop yields are falling by up to one third in some regions and the prices of some crops have more than doubled, with devastating impacts on the world's poor. The urban population, which stood at 3.5 billion in 2010, has now surged to almost 5 billion. Resource scarcity, economic and political factors, energy costs and mounting environmental issues are forcing people into ever more crowded and high-density areas. Many cities are merging to form vast sprawling metropolises with hundreds of millions of people. In some nations, those living in urban areas make up over 90% of the population.
By 2030, urban areas occupy an additional 463,000 sq mi (1.2 million sq km) globally, relative to 2012. This is equivalent to more than 20,000 new football fields being added to the global urban area every day for the first three decades of the 21st century. Almost $30 trillion has been spent during the last two decades on transportation, utilities, and other infrastructure. Some of the most substantial growth has been in China which boasts an urban population approaching one billion and has spent $100 billion annually just on its own projects. Much of the Chinese coastline has been transformed into what is essentially a giant urban corridor. Turkey is another region that has witnessed phenomenal urban development. All of this expansion is having a major impact on the surrounding environment. In addition to cities, new networks of road, rail and utilities have been built, crisscrossing the landscape, and cutting through major wildlife zones. What were previously protected areas are now opening up for resource exploitation and food production. Numerous species are reclassified as endangered during this period as a result of human encroachment, pollution, and habitat destruction. The accelerating magnitude of these and other problems is leading to a rapid migration from traditional fossil fuels to renewable energy. Advances in nanotechnology have resulted in greatly improved solar power. In some countries, such as Japan, photovoltaic materials are being added to almost every new building. Energy supplies in general are becoming more localised and efficient. This transition is putting increasing strain on fossil fuel companies, since the proven reserves of oil, coal and natural gas far exceed the decided "safe" limit for what can be burned. Because most reserves had already been factored into the market value of these organisations, they now face the prospect of huge financial loss. In response, many companies are fighting tooth and nail against further regulation.
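The "20,000 football fields per day" comparison can be sanity-checked with rough arithmetic. The sketch below assumes an American football field including end zones, roughly 110 × 49 metres (a value not given in the text):

```python
# Back-of-envelope check of the football-field comparison.
# Assumed field size: American football field incl. end zones,
# ~109.7 m x 48.8 m (120 x 53.3 yards).
FIELD_M2 = 109.7 * 48.8   # ~5,353 m^2 per field
SQMI_TO_KM2 = 2.58999

fields_per_day = 20_000
years = 30                # "first three decades of the 21st century"

added_km2 = fields_per_day * FIELD_M2 * 365 * years / 1e6
quoted_km2 = 463_000 * SQMI_TO_KM2

print(f"fields estimate: {added_km2:,.0f} km^2")
print(f"quoted figure:   {quoted_km2:,.0f} km^2")
```

The two figures agree to within a few percent (both around 1.2 million sq km), so the daily comparison is consistent with the total.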
Another issue which governments have to contend with during this time is the aging population, which has seen a doubling of retired persons since the year 2000. People are living longer, healthier lives. With state pension budgets under increasing strain, the overall effect is a decreased income for senior citizens. Retirement ages are increasing in America, Asia, and most European countries, and many employees are forced to work into their 70s. Stress levels for the average person have continued to increase, as the world adapts to these various crises. China's Long March 9 rocket begins lunar missions. The Long March 9 (officially CZ-9) is a new Chinese rocket, first announced in 2018 and intended for long-range missions to the Moon, Mars and beyond. With a payload capacity of 140,000 kg to low Earth orbit (LEO) and 50,000 kg to trans-lunar injection, it ranks among the largest rockets ever built – one of the very few appearing in the "super heavy-lift" launch vehicle class. The Long March 9 is a three-stage rocket with a large core having a diameter of 10 metres, surrounded by a cluster of four engines. Comparable in size to NASA's retired Saturn V, this huge rocket is specifically designed to expand China's capabilities beyond Earth and deeper into space. Sitting atop the rocket is a next-generation, lunar-capable spacecraft with capacity for up to six astronauts. The Long March 9 completed feasibility studies in 2021 and received government approval that same year. The 14th Five Year Plan (2021–25) enabled it to proceed to the next stage of development. By 2030, a maiden test flight has occurred, and the launch vehicle is being prepared for use in lunar missions. Following additional test flights, China lands its first astronauts on the Moon in the early part of this decade.
This is taking place alongside similar efforts by the United States, which now has its own lunar-capable rocket – NASA's Space Launch System – as well as commercial ventures such as SpaceX and Blue Origin. The two nations, having been engaged in a second space race for the last two decades, are now finally seeing the fruits of their long-term research and development. The Long March 9 forms a pivotal part of China's operations on the Moon, not only for sending astronauts on short duration missions but also for establishing a more permanent presence. Its huge cargo-carrying capacity allows a scientific outpost to form on the lunar surface during the late 2030s.

The 6G standard is released. By 2030, a new cellular network standard has emerged that offers even greater speeds than 5G. Early research on this sixth generation (6G) had started during the late 2010s when China, the USA and other countries investigated the potential for working at higher frequencies. Whereas the first four mobile generations tended to operate at between several hundred and several thousand megahertz, 5G had expanded this range into the tens of thousands and hundreds of thousands. A revolutionary technology at the time, it allowed vastly improved bandwidth and lower latency. However, it was not without its problems, as exponentially growing demand for wireless data transfer put ever-increasing pressure on service providers, while even shorter latencies were required for certain specialist and emerging applications. This led to development of 6G, based on frequencies ranging from 100 GHz to 1 THz and beyond. A ten-fold boost in data transfer rates would mean users enjoying terabits per second (Tbit/s). Furthermore, improved network stability and latency – achieved with AI and machine learning algorithms – could be combined with even greater geographical coverage. 
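As a rough illustration of what terabit-class links mean in practice, consider the time to transfer a large file. The file size and peak rates below are assumed, illustrative values, not official 5G or 6G specifications:

```python
# Illustrative comparison of download times at 5G-era vs 6G-era peak rates.
# All figures are rough assumptions for the sake of the example.

def download_time_seconds(size_bytes: float, rate_bits_per_s: float) -> float:
    """Time to transfer `size_bytes` at a sustained rate of `rate_bits_per_s`."""
    return size_bytes * 8 / rate_bits_per_s

MOVIE_8K = 100e9   # ~100 GB, an assumed size for an uncompressed-quality 8K film
RATE_5G = 10e9     # 10 Gbit/s, an optimistic 5G peak rate (assumed)
RATE_6G = 1e12     # 1 Tbit/s, the terabit-class figure cited for 6G

print(f"5G: {download_time_seconds(MOVIE_8K, RATE_5G):.0f} s")   # 80 s
print(f"6G: {download_time_seconds(MOVIE_8K, RATE_6G):.1f} s")   # 0.8 s
```

A transfer that takes over a minute at 5G-era speeds completes in under a second at terabit rates, which is why applications such as instant bulk synchronisation become plausible at this scale.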
The Internet of Things, already well-established during the 2020s, now had the potential to grow by further orders of magnitude and connect not billions, but trillions of objects. Following a decade of research and testing, widespread adoption of 6G occurs in the 2030s. However, wireless telecommunications are now reaching a plateau in terms of progress, as it becomes extremely difficult to extend beyond the terahertz range. These limits are eventually overcome but require wholly new approaches and fundamental breakthroughs in physics. The idea of a seventh standard (7G) is also placed in doubt by several emerging technologies that support the existing wireless communications, making future advances iterative, rather than generational.

Desalination has exploded in use. A combination of increasingly severe droughts, aging infrastructure and the depletion of underground aquifers is now endangering millions of people around the world. The on-going population growth described earlier is only exacerbating this, with global freshwater supplies continually stretched to their limits. This is forcing a rapid expansion of desalination technology. The idea of removing salt from saline water had been described as early as 320 BC. In the late 1700s it was used by the U.S. Navy, with solar stills built into shipboard stoves. It was not until the 20th century, however, that industrial-scale desalination began to emerge, with multi-stage flash distillation and reverse osmosis membranes. Waste heat from fossil fuel or nuclear power plants could be used, but even then, these processes remained prohibitively expensive, inefficient, and highly energy intensive. By the early 21st century, the world's demand for resources was growing exponentially. The UN estimated that humanity would require over 30 percent more water between 2012 and 2030. 
Historical improvements in freshwater production efficiency were no longer able to keep pace with a ballooning population, made worse by the effects of climate change. New methods of desalination were seen as a possible solution to this crisis and a number of breakthroughs emerged during the 2000s and 2010s. One such technique – of particular benefit to arid regions – was the use of concentrated photovoltaic (CPV) cells to create hybrid electricity/water production. In the past, these systems had been hampered by excessive temperatures which made the cells inefficient. This issue was overcome by the development of water-filled micro-channels, capable of cooling the cells. In addition to making the cells themselves more efficient, the heated wastewater could then be reused in desalination. This combined process could reduce cost and energy use, improving its practicality on a larger scale. Breakthroughs like this and others, driven by huge levels of investment, led to a substantial increase in desalination around the world. This trend was especially notable in the Middle East and other equatorial regions, home to both the highest concentration of solar energy and some of the fastest-growing populations. However, this exponential progress was dwarfed by the sheer volume of water required by an ever-expanding global economy, which now included the burgeoning middle classes of China and India. The world was adding an extra 80 million people each year – equivalent to the entire population of Germany. By 2017, Yemen was in a state of emergency, with its capital almost entirely depleted of groundwater. Significant regional instability began to affect the Middle East, North Africa, and South Asia, as water resources became weapons of war. Amid this turmoil, even greater advances were being made in desalination. 
It was acknowledged that present trends in capacity – though impressive compared to earlier decades – were insufficient to satisfy global demand, and that a major, fundamental breakthrough, deployable on a large scale, would therefore be needed. Nanotechnology offered just such a breakthrough. The use of graphene in the water filtration process had been demonstrated in the early 2010s. This involved atom-thick sheets of carbon, able to separate salt from water using much lower pressure, and hence, much lower energy. This was due to the extreme precision with which the perforations in each graphene membrane could be manufactured. At only a nanometre across, each hole was the perfect size for a water molecule to fit through. An added benefit was the very high durability of graphene, potentially making desalination plants more reliable and longer lasting. Unfortunately, patents were secured by corporations that initially limited its wider use. A number of high-profile international lawsuits were brought, as entrepreneurs and companies attempted to develop their own versions. With a genuine crisis unfolding, this led to an eventual restructuring of intellectual property rights. By 2030, graphene-based filtration systems have closed most of the gap between supply and demand, easing the global water shortage. However, the delayed introduction of this revolutionary technology has caused problems in many vulnerable parts of the world. In the 2040s and beyond, desalination will play an even more crucial role, as humanity adapts to a rapidly changing climate. Ultimately, it will become the world's primary source of freshwater, as non-renewable sources like fossil aquifers are depleted around the globe.

"Smart grid" technology is widespread in developed nations. In prior decades, the disruptive effects of energy shocks, alongside the ever-increasing demands of growing and industrialising populations, were putting strain on the world's power grids. 
Blackouts occurred in the worst-hit regions, with consumers becoming more and more conscious of their energy use and taking measures to monitor and/or cut back their consumption. This already precarious situation was exacerbated by the relatively ancient infrastructure in many countries. Much of the grid at the beginning of the 21st century was extremely old and inefficient, losing more than half of its available electricity during production, transmission, and usage. A convergence of business, political, social, and environmental issues forced governments and regulators to finally address this problem. By 2030, integrated smart grids are becoming widespread in the developed world, the main benefit of which is the optimal balancing of demand and production. Traditional power grids had previously relied on a just-in-time delivery system, where supply was constantly adjusted by hand in order to match demand. Now, this problem is being eliminated by a vast array of sensors and automated monitoring devices embedded throughout the grid. This approach had already emerged on a small scale, in the form of smart meters for individual homes and offices. By 2030, it is being scaled up to entire national grids. Power plants now maintain constant, real-time communication with all residents and businesses. If capacity is ever strained, appliances instantly self-adjust to consume less power, even turning themselves off completely when idle. Since balancing demand and production is now achieved on a real-time, automatic basis within the grid itself, this greatly reduces the need for "peaker" plants as supplemental sources. In the event of any remaining gap, algorithms calculate the exact requirements and turn on extra generators automatically. Computers also help adjust for and level out peaks and troughs in energy demand. Sensors in the grid can detect precisely when and where consumption is highest. 
Over time, production can be automatically shifted according to the predicted rise and fall in demand. Smart meters can then adjust for any discrepancies. Another benefit of this approach is allowing energy providers to raise electricity prices during periods of high consumption, helping to flatten out peaks. This makes the grid more reliable overall, since it reduces the number of variables that need to be accounted for. Yet another advantage of the smart grid is its capacity for bidirectional flow. In the past, power transmission could only be done in one direction. Today, a proliferation of local power generation, such as photovoltaic panels and fuel cells, means that energy production is much more decentralised. Smart grids now consider homes and businesses which can add their own surplus electricity to the system, allowing energy to be transmitted in both directions through power lines. This trend of redistribution and localisation is also making large-scale renewables more viable, since the grid is now adaptable to the intermittent power output of solar and wind. On top of this, smart grids are also designed with multiple full load-bearing transmission routes. This way, if a broken transmission line causes a blackout, sensors instantly locate the damaged area while electricity is rerouted around it. Crews no longer need to investigate multiple transformers to isolate a problem, and blackouts are reduced as a result. This also prevents any kind of domino effect from setting off a rolling blackout. Overall, this new "internet of energy" is far more sustainable, efficient, and reliable. Energy costs are reduced, while paving the way to a post-carbon economy. Countries that quickly adopt smart grids are better protected from oil shocks, while greenhouse gas emissions are reduced by almost 20 per cent in some nations. As the shift to clean energy continues, this situation will only improve, expanding to even larger scales. 
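The balancing loop described above – flexible loads shedding first, with peaker generation dispatched only for any remaining gap – can be sketched in a few lines. This is a toy illustration only; all names and numbers are assumed, and a real dispatch system is vastly more complex:

```python
# Toy sketch of smart-grid demand balancing: self-adjusting appliance load is
# shed first, and a "peaker" plant covers only the remaining shortfall.
# All quantities and the function itself are illustrative assumptions.

def balance(base_supply_mw: float, demand_mw: float, sheddable_mw: float):
    """Return (shed_mw, peaker_mw) needed to match demand to supply."""
    gap = demand_mw - base_supply_mw
    if gap <= 0:
        return 0.0, 0.0                  # surplus supply: nothing to do
    shed = min(gap, sheddable_mw)        # appliances self-adjust first
    peaker = gap - shed                  # only the remainder needs a peaker
    return shed, peaker

# Example: 950 MW of demand against 900 MW of baseline supply,
# with 30 MW of deferrable appliance load available to shed.
shed, peaker = balance(900.0, 950.0, 30.0)
print(shed, peaker)   # 30.0 20.0
```

The key design point is the ordering: because demand response is effectively free compared with spinning up extra generation, the algorithm exhausts sheddable load before touching the peaker.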
Regions begin merging their grids together on a country-to-country, and eventually continent-wide, basis.

An interstellar message arrives at Luyten's Star. Luyten's Star (GJ 273) is a red dwarf located about 12.4 light-years from Earth. Despite its relatively close proximity, it has a visual magnitude of only 9.9, making it too faint to be seen with the naked eye. It was named after Willem Luyten, who, in collaboration with Edwin G. Ebbighausen, first determined its high proper motion in 1935. Luyten's Star has one-quarter the mass of the Sun and 35% of its radius. In March 2017, two planets were discovered orbiting Luyten's Star. The outer planet, GJ 273b, was a "super-Earth" with 2.9 Earth masses, found to be lying in the habitable zone with potential for liquid water on the surface. The inner planet, GJ 273c, had 1.2 Earth masses, but orbited much closer, with an orbital period of only 4.7 days. In October 2017, a project known as "Sónar Calling GJ 273b" was initiated. This would send music through deep space in the direction of Luyten's Star in an attempt to communicate with extra-terrestrial intelligence. The project – organised by Messaging Extra-terrestrial Intelligence (METI) and Sónar (a music festival in Barcelona, Spain) – beamed a series of radio signals from a radar antenna at Ramfjordmoen, Norway. The first transmissions were sent on 16th, 17th, and 18th October, with a second batch in April 2018. This became the first radio message ever sent to a potentially habitable exoplanet. The message included 33 music pieces of 10 seconds each, by artists including Autechre, Jean Michel Jarre, Kate Tempest, Kode 9, Modeselektor and Richie Hawtin. Also included were scientific and mathematical tutorials sent in binary code, designed to be understandable by extra-terrestrials; a recording of an unborn baby girl's heartbeat; along with poetry and political statements about humans. 
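Since radio signals travel at light speed, the earliest possible date for any reply follows from simple round-trip arithmetic:

```python
# Round-trip light-travel time for a reply from Luyten's Star,
# located ~12.4 light-years from Earth.
distance_ly = 12.4
round_trip_years = 2 * distance_ly   # outbound signal plus a hypothetical reply
first_sent = 2017 + 10 / 12          # transmissions began in October 2017
earliest_reply = first_sent + round_trip_years
print(int(earliest_reply))           # 2042
```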
Due to the lag from light speed over a distance of more than 70 trillion miles, the earliest possible date for a response to arrive back would be 2042.

Depression is the number one global disease burden. When measured by years of life lost, depression has now overtaken heart disease to become the leading global disease burden. This includes both years lived in a state of poor health and years lost due to premature death. Principal causes of depression include debt worries, unemployment, crime, violence (especially family violence), war, environmental degradation, and disasters. The on-going economic stagnation around the world is a major contributing factor. However, progress is being made with destigmatising mental illness.

Child mortality is approaching 2% globally. Childhood mortality, defined as the number of children dying under the age of five, was a major issue during the late 20th century. In 1970, more than 14% of children worldwide never saw their 5th birthday, while in Africa the figure was even higher at over 24%. The gap between rich and poor nations was staggering, with a mortality rate of only 24 per 1,000 live births in the most industrialised countries – an order of magnitude lower. Improvements in medicine, education, economic opportunity and living standards led to a fall in child deaths over subsequent decades. More and more children were being saved by low-tech, cost-effective, evidence-based measures. These included vaccines, antibiotics, micronutrient supplementation, insecticide-treated bed nets, improved family care and breastfeeding practices, and oral rehydration therapy. The empowerment of women, the removal of social and financial barriers to accessing basic services, new innovations that made the supply of critical services more available to the poor, and increasing local accountability were policy interventions that reduced mortality and improved equity. 
The U.N.'s Millennium Development Goals included the ambitious target of reducing by two-thirds (between 1990 and 2015) the number of children dying under age five. While this goal failed to be met in time, the progress achieved was still significant – a drop from 92 to 43 deaths per 1,000 live births. Public, private, and non-profit organisations, keen to build on their experience and ensure the continuation of this trend, made childhood survival a focus of the new sustainable development agenda for 2030. A new objective was set, which aimed to lower the under-five mortality figure to less than 25 per 1,000 live births worldwide. With ongoing improvements in public health and education – aided by widespread access to the Internet in developing regions – this new goal was largely met, with further declines in childhood mortality from 2015 to 2030. Although some regions in Africa still have unacceptably high rates, the overall worldwide figure is around 2% by 2030. One recent development now having a major impact is the mass application of gene drives to control mosquito populations, greatly reducing the number of malaria cases. Huge advances have also been made in the prevention and treatment of HIV, which is no longer the death sentence it used to be. Some diseases have been eradicated by now, including polio, Guinea worm, elephantiasis, river blindness, and blinding trachoma. However, the progress achieved in recent decades is now threatened by the worsening problems of climate change and other environmental issues, along with antibiotic resistance. Even discounting these emerging threats, it is simply not practical to prevent every childhood death with current levels of technology and surveillance. As such, childhood mortality begins to taper off – not reaching zero until much further into the future.

The Muslim population has increased significantly. By 2030, the Muslim share of the global population has reached 26.4%. 
This compares with 19.1% in 1990. Countries which have seen the largest growth rates include Ireland (190.7%), Canada (183.1%), Finland (150%), Norway (149.3%), New Zealand (146.3%), the United States (139.5%) and Sweden (120.2%). Those which have experienced the biggest falls include Lithuania (-33.3%), Moldova (-13.3%), Belarus (-10.5%), Japan (-7.6%), Guyana (-7.3%), Poland (-5.0%) and Hungary (-4.0%). A number of factors have driven this trend. Firstly, Muslims have higher fertility rates (more children per woman) than non-Muslims. Secondly, a larger share of the Muslim population has entered – or is entering – the prime reproductive years (ages 15-29). Thirdly, health and economic gains in Muslim-majority countries have resulted in greater-than-average declines in child and infant mortality rates, with life expectancy improving faster too. Despite an increasing share of the population, the overall rate of growth for Muslims has begun to slow when compared with earlier decades. Later this century, both Muslim and non-Muslim numbers will approach a plateau as the global population stabilises. The spread of democracy and improved access to education are emerging as major factors in the slowing fertility rates (though Islam has yet to undergo the sort of renaissance and reformation that Christianity went through). Sunni Muslims continue to make up the overwhelming majority (90%) of Muslims in 2030. The portion of the world's Muslims who are Shia has declined slightly, mainly because of relatively low fertility in Iran, where more than a third of the world's Shia Muslims live.

Full weather modelling is perfected. Zettaflop-scale computing is now available, which is a thousand times more powerful than computers of 2020 and a million times more powerful than those of 2010. One field seeing particular benefit during this time is meteorology. Weather forecasts can be generated with 99% accuracy over a two-week period. 
Satellites can map wind and rain patterns in real time at phenomenal resolution – from square kilometres in previous decades, down to square metres with today's technology. Climate and sea level predictions can also be achieved with greater detail than ever before, offering greater certainty about the long-term outlook for the planet.

Orbital space junk is becoming a major problem for space flight. Space junk – debris left in orbit from human activities – has been steadily building in low-Earth orbit for more than 70 years. It is made up of everything from spent rocket stages, to defunct satellites, to debris left over from accidental collisions. The size of space junk can reach up to several metres but is most often minuscule particles such as metal shavings and paint flecks. Despite their small size, such pieces of debris can collide at relative speeds of up to 30,000 mph – easily fast enough to deal significant damage to a spacecraft. Satellites, rockets, and space stations, as well as astronauts conducting spacewalks, have all had to cope with the increasing damage caused by collisions with these particles. One of the biggest issues with space junk is the fact that it grows exponentially. This trend, along with the increasing number of countries entering space, has made orbital collisions an almost regular occurrence in recent years. The newest space-faring nations have been particularly affected. Events similar to the 2009 collision of the US Iridium and Russian Cosmos satellites have raised fears of the so-called Kessler Syndrome. This scenario is where space junk reaches a critical mass, triggering a chain reaction of collisions until virtually every satellite and man-made object in an orbital band has been reduced to debris. Such an event could devastate the global economy and render future space travel almost impossible. By 2030, the amount of space junk in orbit has tripled, compared to 2011. Countless millions of fragments can now be found at various levels of orbit. 
A new generation of shielding for spacecraft and rockets is being developed, along with tougher and more durable space suits for astronauts. This includes the use of "self-healing" nanotechnology materials, though expenses are too high to outfit everything. Larger chunks of debris have also been impacting Earth itself more frequently. Though most land in the ocean (since the planet's surface is 70% covered by water), a few crash on land, necessitating early warning systems for people in the affected areas. Increased regulation has begun to mitigate the growth of space debris, while better shielding and repair technology has reduced the frequency of damage. Increased computing power and tracking systems are also helping to predict the path of debris and instruct spacecraft to avoid the most dangerous areas. Options to physically move debris are also being deployed – including nets and harpoons fired from small satellites, along with ground-based lasers that can push junk into decaying orbits, so it burns up in the atmosphere. Despite this, space junk remains an expensive problem for now.

Jupiter Icy Moon Explorer (JUICE) reaches the Jovian system. Jupiter Icy Moon Explorer (JUICE) is a mission by the European Space Agency (ESA) to explore the Jovian system, focussing on the moons Ganymede, Callisto and Europa. Launched in 2022, the craft goes through an Earth-Venus-Earth-Earth gravity assist sequence, before finally arriving at Jupiter in 2030. JUICE initially studies Jupiter's atmosphere and magnetosphere, gaining valuable insight into how the gas giant might have originally formed. For its primary objective, the probe performs a series of flybys around some of the largest Galilean moons. Ganymede, Callisto and Europa are focussed on since all are believed to hold subsurface liquid water oceans. 
JUICE records detailed images of Callisto (which has the most heavily cratered surface in the Solar System), while also taking the first complete measurements of Europa's icy crust and scanning for organic molecules that are essential to life. In 2033, the probe enters orbit around Ganymede for the final phase of the mission. The detailed study includes:

- Characterisation of the different ocean layers, and detection of sub-surface water reservoirs.
- Topographical, geological, and compositional mapping of the surface.
- Confirmation of the physical properties within the icy crust.
- Characterisation of internal mass distribution.
- Investigation of the exosphere (a tenuous outer atmosphere).
- Study of Ganymede's intrinsic magnetic field and its interactions with the Jovian magnetosphere.
- Determining the moon's potential to support life.

This final stage of the mission provides a vast wealth of empirical data. When combined with new information from Callisto and Europa, it generates an extremely detailed picture of the Galilean moons. JUICE also studies possible locations for future surface landings. Indeed, various plans are now underway to further explore the Jovian system, with mission capabilities being enhanced by a new generation of cheaper and reusable rockets. This includes sample return missions and the first lander to drill down and explore the subsurface liquid oceans.

The UK space industry has quadrupled in size. In 2010, the UK government established the United Kingdom Space Agency (UKSA). This replaced the British National Space Centre and took over responsibility for key budgets and policies on space exploration – representing the country in all negotiations on space matters and bringing together all civil space activities under one single management. By 2014, the UK's thriving space sector was contributing over £9 billion ($15.2 billion) to the economy each year and directly employing 29,000 people, with an average growth rate of almost 7.5%. 
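A quick sanity check on the quadrupling timescale: at a constant compound annual growth rate of roughly 7.5% – the figure quoted above – a sector quadruples in about 19 years, so sustaining or slightly exceeding that rate from the mid-2010s is broadly consistent with a fourfold expansion by around 2030. A minimal rule-of-thumb calculation (not an official projection):

```python
import math

# Years needed to grow N-fold at a constant compound annual growth rate (CAGR).
def years_to_grow(factor: float, cagr: float) -> float:
    return math.log(factor) / math.log(1 + cagr)

print(round(years_to_grow(4, 0.075), 1))   # 19.2 — quadrupling time at 7.5%/year
print(round(years_to_grow(2, 0.075), 1))   # 9.6  — doubling time at the same rate
```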
Recognising its strong potential, the government backed plans for a fourfold expansion of the industry. New legal frameworks allowed a spaceport to be established in the UK – triggering growth of space tourism, launch services and other hi-tech companies. By 2030, the UK has become a major player in the space industry, with a global market share of 10%. Having quadrupled in size, its space industry now contributes £40 billion ($67 billion) a year to the economy and has generated over 100,000 new high-skilled jobs. The UK has significantly increased its leadership and influence in crucial areas like satellite communications, Earth observation, disaster relief and climate change monitoring. The growth of space-based products and services means the UK is now among the first 100% broadband-enabled countries in the world. This has also reduced the costs of delivering government services to all citizens, regardless of their location.

The Lockheed Martin SR-72 enters service. The SR-72 is an unmanned, hypersonic aircraft intended for intelligence, surveillance, and reconnaissance. Developed by Lockheed Martin, it is the long-awaited successor to the SR-71 Blackbird that was retired in 1998. The plane combines both a traditional turbine and a scramjet to achieve speeds of Mach 6.0, making it twice as fast as the SR-71 and capable of crossing the Atlantic Ocean in under an hour. A scaled demonstrator was built and tested in 2018. This was followed by a full-size demonstrator in 2023 and then entry into service by 2030. The SR-72 is similar in size to the SR-71, at approximately 100 ft (30 m) long. With an operational altitude of 80,000 feet (24,300 metres), combined with its speed, the SR-72 is almost impossible to shoot down.

Half of America's shopping malls have closed. For much of the 20th century, shopping malls were an intrinsic part of American culture. At their peak in the mid-1990s, the country was building 140 new shopping malls every year. 
But from the early 2000s onward, underperforming and vacant malls – known as "greyfield" and "dead mall" estates – became an emerging problem. In 2007, a year before the Great Recession, no new malls were built in America, for the first time in half a century. Only a single new mall, City Creek Center in Salt Lake City, was built between 2007 and 2012. The economic health of surviving malls continued to decline, with high vacancy rates creating an oversupply. A number of changes had occurred in shopping and driving habits. More and more people were living in cities, with fewer interested in driving and people in general spending less than before. Tech-savvy Millennials (also known as Generation Y), in particular, had embraced new ways of living. The Internet had made it far easier to identify the cheapest products and to order items without visiting a store in person. In earlier decades, this had mostly affected digital goods such as music, books, and videos, which could be obtained in a matter of seconds – but even clothing was eventually possible to download, thanks to the proliferation of 3D printing in the home. Many of these abandoned malls are now being converted to other uses, such as housing.

Emerging job titles of today include: Alternative Vehicle Developer, Avatar Manager / Devotee, Body Part Maker, Climate Change Reversal Specialist, Memory Augmentation Surgeon, Nano Medic, Narrowcaster, 'New Science' Ethicist, Old Age Wellness Manager / Consultant Specialist, Quarantine Enforcer, Social 'Networking' Officer, Space Pilot / Orbital Tour Guide, Vertical Farmer, Virtual Clutter Organizer, Virtual Lawyer, Virtual Teacher, Waste Data Handler.

8K VR headsets are common. 8K displays (amounting to 33 MP of resolution per eye) are a fairly standard feature of virtual reality (VR) in 2030. These offer quadruple the pixel count of the best consumer VR products from a decade earlier. 
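The per-eye figures above can be verified with simple pixel arithmetic, assuming standard 8K (7680×4320) and 4K-class (3840×2160) panel resolutions:

```python
# Pixel counts per eye: an 8K panel versus a 4K-class panel from a decade earlier.
pixels_8k = 7680 * 4320        # 33,177,600 ≈ 33 MP, matching the figure cited
pixels_4k = 3840 * 2160        # 8,294,400 ≈ 8.3 MP
print(pixels_8k // pixels_4k)  # 4 — i.e. quadruple the pixel count
```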
Following a long period with little or no activity, the VR industry saw a major revival from around 2015 onwards. A prototype of the Oculus Rift and its subsequent commercial release led to dozens of competitors within a few years, including models with better resolution and fields of view (FOV). Having initially been a somewhat expensive and niche form of entertainment, VR declined greatly in cost during the 2020s. The COVID-19 pandemic accelerated its mainstream adoption. Between 2021 and 2028, the industry experienced a compound annual growth rate of 18%. Virtual, augmented, and mixed realities together produced $1.5 trillion in net economic benefits by the end of the decade. By 2030, the quality of VR has improved exponentially. The latest screens now provide breathtaking detail and realism, ultra-low latency, and wide FOV, while a variety of new features are combining to enhance the level of immersion and interactivity still further. For example, most headsets now include as standard the option for a brain-computer interface (BCI) to record users' electrical signals, enabling actions to be directed by merely thinking about them. Such technology had already begun to emerge some years previously, but has now improved greatly in terms of speed, accuracy, and ubiquity. Non-invasive sensors placed on the scalp are by far the preferred choice for mainstream BCI use. However, more advanced options for invasive BCIs have now begun to emerge, as the technology shifts from purely clinical uses (such as treating paralysis) and into business, leisure, and entertainment. Although still at a niche and experimental phase of development, early adopters willing to undergo surgery and have electrodes touching the surface of their brain can use bidirectional links for both reading and writing information to their neocortex. In VR gaming, these more invasive BCIs can increase the level of immersion, tricking the senses in ways that bring a player closer to the action. 
New visual, auditory, and tactile sensations are made possible by stimulating both the motor and visual cortex. These effects are rather limited at this stage and exploited by only the most hardcore gamers – but provide more lifelike ways of interacting with simulated people, objects, and environments. This decade sees much progress with BCI technology as the number of electrodes used in the implants grows by leaps and bounds, enabling larger and more complex brain patterns to be recorded and decoded. In addition to gaming, BCIs gain popularity from the enhancement of wellness functions, such as guided meditation and the improvement of sleep quality. At the same time, ethical issues are emerging over consent, privacy, identity, and agency, especially when BCIs are combined with AI.

100 terabyte HDDs reach consumer level. As of 2020, consumer-level hard disk drives (HDDs) featured capacities up to a maximum of about 18 terabytes (TB). While solid state drives (SSDs) had been available with greater storage sizes for enterprise-grade users, as well as faster transfer speeds, traditional HDDs remained a relevant and attractive alternative thanks to their much lower costs. With perpendicular magnetic recording (PMR) approaching its limits, even when boosted with two-dimensional magnetic recording (TDMR), the capacity of HDDs had seen a reduced rate of growth in the 2010s. However, an innovative technique known as heat-assisted magnetic recording (HAMR) allowed data to be written to much smaller spaces. This provided a major acceleration in storage capacities during the 2020s, creating a new paradigm for HDDs, with successive generations now able to jump in larger steps of 4TB, 6TB, or even 10TB at a time. Although data densities continued to grow rapidly, transfer speeds became an increasingly important consideration. Vast storage volumes needed to be accessible at rates commensurate with their size. 
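To see why throughput matters as much as capacity, compare the time needed to read a drive end-to-end as capacities grow. The sustained speeds below are assumed, illustrative values, not specifications for any particular drive:

```python
# Time to sequentially read an entire drive at a sustained throughput.
# Capacities and speeds are illustrative assumptions.

def full_read_hours(capacity_tb: float, throughput_mb_s: float) -> float:
    """Hours to read `capacity_tb` terabytes at `throughput_mb_s` MB/s."""
    return capacity_tb * 1e12 / (throughput_mb_s * 1e6) / 3600

# An 18 TB drive vs a 100 TB drive at the same assumed ~250 MB/s speed.
print(f"{full_read_hours(18, 250):.0f} h")    # 20 h
print(f"{full_read_hours(100, 250):.0f} h")   # 111 h — over 4.5 days
# Doubling sustained throughput (e.g. via a second actuator) halves the time:
print(f"{full_read_hours(100, 500):.0f} h")   # 56 h
```

Without a corresponding rise in throughput, a 100 TB drive would take days to back up or rebuild in a RAID array, which is why per-drive transfer rates became a binding constraint.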
To boost the IOPS per TB performance of HDDs, multi-actuator drives began to emerge. For example, using two actuators instead of one could almost double throughput. With 50TB consumer-level HDDs on sale in 2026, hard drive makers continued to innovate and find ways of making data both smaller and faster, driven by the world's ever-growing demand for storage. By 2030, consumer PC users have access to 100TB HDDs, quintupling the figure of a decade earlier. Completion of Saudi Vision 2030. This year sees the realisation of a long-term strategic framework by Saudi Arabia, intended to reduce the country's dependence on oil, diversify its economy and develop public service sectors such as education, health, infrastructure, recreation, and tourism. The key goals of "Saudi Vision 2030" include reinforcing economic and investment activities, increasing non-oil international trade, and promoting a softer and more secular image of the Kingdom. Crown Prince Mohammad bin Salman first announced the vision in 2016. The Saudi Council of Economic and Development Affairs (CEDA) then began identifying and monitoring the steps crucial for implementation by 2030. The CEDA established 13 programs, called Vision Realisation Programs, covering different areas such as energy, finance, housing, quality of life, and transport.
The Kingdom's goals for 2030 included:

- To move Saudi Arabia from the 19th largest economy in the world into the top 15
- To increase non-oil government revenue from SAR 163 billion (US$43.5bn) to at least SAR 1 trillion (US$267bn)
- To increase women's participation in the workforce from 22% to 30%
- To increase small and medium-sized enterprise (SME) contribution to GDP from 20% to 35%
- To increase the private sector's contribution from 40% to 65% of GDP
- To increase household spending on cultural and entertainment activities inside the Kingdom from the current level of 2.9% to 6%
- To increase the ratio of individuals exercising at least once a week from 13% of the population to 40%
- To increase the average life expectancy from 74 years to 80 years
- To have three Saudi cities be recognised in the top 1% of cities in the world
- To more than double the number of Saudi heritage sites registered with UNESCO

Alongside these socioeconomic measures, proposals for various large-scale projects began to emerge. The developers intended to both improve the country's domestic transport and infrastructure, and to showcase Saudi Arabia to the world as a destination for leisure and investment. They included new retail, hotel, entertainment, cultural and residential megaprojects, as well as industrial, logistics, and corporate facilities. By far the costliest and most prominent took the form of a $500bn smart city in the north-western corner of the country. Known as Neom – a portmanteau of the Greek word Neos, meaning "new," and mustaqbal, the Arabic word for "future" – this would operate independently from the existing governmental framework, with its own tax and labour laws and an autonomous judicial system. According to its developers, Neom would be "a hub for innovation where global business and emerging players can research, incubate and commercialise ground-breaking technologies to accelerate human progress."
In 2021, Saudi Crown Prince Mohammed bin Salman unveiled the first major development within the Neom zone, a planned city named "The Line". The Line (as its name suggested) would consist of a long, linear development stretching for over 170 km (105 miles). This huge, continuous urban belt would enable the Red Sea coastline to the west to be linked with mountains and upper valleys in the east. The developers intended for it to redefine the traditional layout of a city by emphasising a strong focus on nature, liveability, health, and community connections. The Line's masterplan called for building "around nature, rather than over it" and specified large areas of land to be preserved for conservation. The need for cars and other vehicles would be eliminated, with all essential daily needs provided within a five-minute walk for every resident. The project would include a system of ultra-high-speed mass transit running its complete length, with businesses and communities also hyper-connected through a digital framework incorporating artificial intelligence (AI) and robotics. The AI would monitor the city, using predictive and data models to optimise daily life for citizens in various ways. The Line would be self-sufficient with locally grown food, powered by 100% clean energy, home to abundant parks and other green spaces, and with sustainable and regenerative practices adopted throughout the city. The developers completed phase one of The Line by 2025. Following subsequent expansion, its population has reached over a million by 2030, and the city has now boosted Saudi Arabia's economy by SAR 180 billion (US$48bn). In addition to advanced technologies, The Line boasts other features. Its location makes it favourable in terms of weather and climate conditions, being one of the few areas in Saudi Arabia to experience snowfall in winter, as well as benefitting from the ocean breeze and aquatic recreation opportunities. 
Temperatures are 10°C lower than the average for the Arabian Peninsula. As a further geographic advantage, it can also be reached by more than 40% of the world's population in less than a four-hour flight, while 13% of global trade already flows through the Red Sea. The Line serves as a model for future developments within the Neom zone, while also inspiring other large-scale infrastructure projects both in Saudi Arabia and around the world. Cargo Sous Terrain becomes operational in Switzerland. The Cargo Sous Terrain is an underground, automated system of freight transport that becomes operational in Switzerland from 2030 onwards. It is designed to mitigate the increasing problem of road traffic, which has grown by 45% in the region since the mid-2010s. This tube network, including the self-driving carts and transfer stations, is built at a cost of $3.4 billion and is privately financed. The entire project is powered by renewables. An initial pilot tunnel is constructed 50 metres below ground, with a total length of 41 miles (66 km). This connects Zurich, the largest city in Switzerland, with logistics centres near Bern (the capital) in the west. The route includes four above-ground waystations that enable cargo transfers. The pilot tunnel is followed by an expanded network that links Zurich with Lucerne and eventually Geneva, spanning the entire width of the country. The unmanned, automated vehicles are propelled by electromagnetic induction and run at 19 mph (30 km/hour), operating 24 hours a day. An additional monorail system for packages, in the roof of the tunnel, moves at twice this speed. The Cargo Sous Terrain allows goods to be delivered more efficiently, at more regular intervals, while cutting air and noise pollution, as well as reducing the burden of traffic on overground roads and freight trains. The entire ocean floor is mapped. While humans had long ago conquered the Earth's land masses, the deep oceans lay mostly unexplored. 
In the early years of the 21st century, only 20% of the global ocean floor had been mapped in detail. Even the surfaces of the Moon, Mars and other planets were better understood. With data now becoming as important a commodity as oil, researchers set out to acquire knowledge of the remaining 80% and uncover a potential treasure trove of hidden information. Seabed 2030 – a collaborative project between the Nippon Foundation of Japan and the General Bathymetric Chart of the Oceans (GEBCO) – aimed to bring together all available bathymetric data to produce a definitive map of the world ocean floor by 2030. As part of the effort, fleets of automated vessels capable of trans-oceanic journeys would cover millions of square miles, carrying with them an array of sensors and other technology. These uncrewed ships, monitored by a control centre in Southampton, UK, would deploy tethered robots to inspect points of interest all the way down to the floor of the ocean, thousands of metres below the surface. By 2030, the project is largely complete. The maps provide a wealth of new information on the global seabed, revealing its geology in unprecedented detail and showing the location of ecological hotspots, as well as many shipwrecks, crashed planes, archaeological artefacts, and other unique and interesting sites. Commercial applications include the inspection of pipelines, and surveying of bed conditions for telecoms cables, offshore wind farms and so on. However, concerns are raised over the potential impact of new undersea mining technology, the opportunities for which are now greatly increased.

2031: DAVINCI+ lands on Venus. DAVINCI+ is a robotic spacecraft sent to investigate the atmosphere of Venus, as part of NASA's Discovery Program. Having launched in 2029 and performed a series of flybys, it deploys a lander in 2031 – the first since NASA's Pioneer Venus in 1978 and the USSR's Vega in 1985. During its 63-minute descent by parachute, this sphere-shaped probe samples the air, taking measurements of its chemical composition and other properties down to the surface. This data, 10 times more detailed than any previous mission, helps in understanding the atmosphere's origin, how it has evolved over time, and how and why it is different from those of Earth and Mars. It determines the history of water on Venus, such as the presence of oceans in the past, as well as previously unknown processes at work in the lower atmosphere. Before it reaches the surface, the DAVINCI+ probe takes the first ever high-resolution photos of the planet's intriguing, ridged terrain ("tesserae") to explore its origin alongside tectonic, volcanic, and weathering history. This camera, the Venus Descent Imager (VenDI), is similar to the Mars Descent Imager (MarDI) and reveals the surface of Venus in stunning, never-before-seen detail. The probe's other instruments include two spectrometers to obtain the first detailed in situ surveys of noble and trace gases and associated isotope ratios, alongside an "atmospheric structure investigation" to measure the dynamics of the Venusian atmosphere during entry and descent. All of these findings help scientists understand why Venus and Earth took such different paths as they matured, providing another point of comparison for studies of rocky exoplanets. A sister mission, VERITAS, is also taking place at Venus during this time, with a focus on mapping and geology. India has begun phasing out hydrofluorocarbons (HFCs). 
The Montreal Protocol, signed in 1987, was an international agreement to protect the ozone layer by phasing out the production of substances responsible for ozone depletion. These included chlorofluorocarbons (CFCs) and hydrochlorofluorocarbons (HCFCs). CFCs and HCFCs had been used in a wide range of industrial applications – such as aerospace, agriculture, air conditioning, electronics, fire protection, flexible and rigid foam, laboratory measurements and refrigeration. Produced mostly in developed countries, hydrofluorocarbons (HFCs) began to replace CFCs and HCFCs. HFCs posed no harm to the ozone layer because, unlike CFCs and HCFCs, they did not contain chlorine. They were, however, greenhouse gases, with a very high global warming potential (GWP), thousands of times greater than carbon dioxide (CO2) when measured on a per-molecule basis. While their atmospheric concentration was initially very low, it began to grow rapidly in the years following the Montreal Protocol. Dubbed "super greenhouse gases," HFCs raised fears that their soaring use, combined with their high global warming potential, could undercut the benefits expected from the reduction of other greenhouse gases such as carbon dioxide and methane. If left unchecked, it was calculated that HFCs could add a potentially disastrous 0.5°C to global average temperatures by the end of the 21st century. Unlike the other greenhouse gases covered by the 2015 Paris Agreement, HFCs were addressed through separate international negotiations. In September 2016, the New York Declaration urged a global reduction in the use of HFCs. In October 2016, negotiators from 170 nations meeting at the summit of the UN Environment Programme in Kigali, Rwanda, reached a legally-binding accord to phase out hydrofluorocarbons (HFCs) in an amendment to the original Montreal Protocol. While hailed as a major achievement for international diplomacy, there were significant differences on the timing and schedule of reductions.
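The "thousands of times greater than CO2" comparison above can be made concrete with the standard CO2-equivalent formula, CO2e = mass × GWP. A minimal sketch using HFC-134a, a common refrigerant whose 100-year GWP is roughly 1,430 (a figure from IPCC assessments, used here for illustration and not taken from the text):

```python
# CO2-equivalent emissions: mass of gas released times its global warming potential.
GWP_HFC134A = 1430   # approximate 100-year GWP of HFC-134a (illustrative)
GWP_CO2 = 1          # CO2 is the baseline by definition

def co2_equivalent_kg(mass_kg: float, gwp: float) -> float:
    """Convert a mass of greenhouse gas to kg of CO2-equivalent."""
    return mass_kg * gwp

# Leaking a single kilogram of HFC-134a warms as much as ~1.4 tonnes of CO2.
leak_co2e = co2_equivalent_kg(1.0, GWP_HFC134A)
print(f"1 kg HFC-134a = {leak_co2e / 1000:.2f} t CO2e")
```

This per-kilogram leverage is why even the relatively small atmospheric concentrations of HFCs were projected to add a measurable fraction of a degree to global temperatures.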
The final agreement would see the richest countries starting the process in 2019. Over 100 developing nations would follow in 2024. However, a small group of countries argued for and secured a later start, insisting their economies needed time to grow. Among this group was India, the world's third largest emitter of greenhouse gases and a country with surging demand for air conditioning, due to both rising incomes and increasingly extreme hot weather. The chemicals needed to replace HFCs were more flammable and toxic – requiring better-trained and better-paid people to design, install and maintain equipment such as air conditioners safely and correctly. In addition, new technologies were required to capture and store HFCs in some applications. India's Council on Energy, Environment and Water estimated that the total cost of phasing out HFCs in all homes, workplaces and vehicles could reach up to $38 billion. India, along with Pakistan and a number of Gulf states, agreed to "freeze" their use of HFCs by 2028. This plateau would be followed by steep reductions from 2031 onwards, leading to the vast majority of HFCs being eliminated by the late 2040s. Much of Bangkok is being abandoned due to flooding. Bangkok, with a population of over 12 million, has been sinking for decades. By the early 2030s, it is facing a disaster of epic scale, with much of the city being abandoned. This has occurred for several reasons. First and foremost, the city is built on clay. When originally settled, the region was just swampy coastline, but today it is covered by skyscrapers, highways, and urban development. The enormous weight of all this concrete and steel has been pushing down on the soft clay beneath, causing the soils to descend by up to 5.3cm per year. By 2010, part of the megalopolis was already under sea level, a trend that would only become worse in the following decades. The illegal tapping of groundwater has been another major factor.
Many of the city's residents have been continuously pumping up groundwater – both for their own use and to sell as a commodity – removing a natural layer and resulting in further destabilisation of the soil. Rising sea levels due to global warming have been yet another factor, eroding the coastline at a rate of 4cm a year, while the increasing severity of monsoon rains has led to longer and more devastating floods. The explosive growth of Bangkok in recent decades (making it one of the fastest growing places in southeast Asia) has dealt a serious blow to the city's infrastructure. Areas of land that had in the early 20th century been used to absorb flood waters had vast suburbs and business districts built over them. Canals were filled in to make way for the rapid urbanisation of the Chao Praya River Delta. The weight of the city grew and grew, to the point where the soft soil it was built upon could simply no longer support it. By the early 2030s, large portions of the megalopolis are well below sea level. The government's response during this time has proven inadequate, a lack of clear policy doing little to help the overall situation, while sea walls have been almost useless due to increasing erosion of the shore. The lowering of the city, combined with rising sea levels (over 20cm higher than in 2000), has resulted in whole districts of Bangkok being permanently abandoned. Over a million buildings, the majority residential, are rendered uninhabitable, forcing their occupants to move further inland. Many areas which have yet to be fully claimed by the sea have also been evacuated, as the regularity of flooding proved too costly for many. Shantytowns and refugee camps are forming outside the city, while the government struggles to adjust as the capital sinks. Thailand as a whole is going through a period of almost unimaginable stress at this time, a result of such huge population displacement. 
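The subsidence and sea-level figures above can be combined into a rough projection of how far parts of the city sink relative to the sea. A simplified linear sketch (real subsidence rates vary across the city and change over time, so this is an illustrative upper bound, not a measured figure):

```python
# Relative land loss: subsidence plus sea-level rise, treated linearly.
subsidence_cm_per_year = 5.3   # worst-case rate cited for the soft clay
sea_level_rise_cm = 20         # rise since 2000, per the text

def relative_drop_cm(years: float) -> float:
    """Total drop of land relative to the sea after `years` of subsidence,
    on top of the sea-level rise already observed."""
    return subsidence_cm_per_year * years + sea_level_rise_cm

# Over the roughly three decades from 2000 to the early 2030s:
drop_m = relative_drop_cm(30) / 100
print(f"Worst-case relative drop: ~{drop_m:.1f} m")
```

Even at a fraction of the worst-case rate, a city founded barely above sea level ends up well below it within a few decades, which is the mechanism behind the abandonment described here.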
The political, economic, and social upheaval in the region is having a significant impact on global GDP. Efforts are underway to save Bangkok's numerous historical monuments and artifacts, with some temples being moved inland and reconstructed in their entirety. Due to the scale of this disaster, however, much is lost. In the coming years, the situation for Bangkok will only worsen as more and more of the city is permanently flooded. By the end of this century, the entire city will be abandoned. Global reserves of lead are running out. Lead is a carbon group element with high density, malleability, softness, and ductility. Metallic lead is relatively rare in the Earth's crust, and so is usually processed from zinc, silver, and copper ores. Like silver, lead has been in use by humans for thousands of years. It was widely exploited by the Roman Empire and played a large role in the Industrial Revolution. World production doubled from 1850 to 1900, doubled again from 1900 to 1950, then doubled yet again from 1950 to 2000. Due to its high density, it has often been used as a weight or ballast, as well as for radiation shielding. It is also used in firearms and other weaponry. The bulk of lead is used in producing car batteries and similar products, as well as in electrodes and high voltage wires. The primary producers are China, Australia, the United States, Canada, and Kazakhstan. Lead is also a pollutant and can be hazardous to human health, being infamous for its older uses in paint and fuels. From the 18th to 21st centuries, environmental levels of lead increased more than 1,000-fold. In terms of numbers of people exposed and the public health impact, it became one of the largest problems in environmental medicine. Although regulations from the 1970s began to reduce the lead content in products and greatly cut exposure in the developed world, many developing countries still allowed its use. By the early 2030s, most reserves of lead are approaching exhaustion.
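The production history above, doubling in each of the three 50-year periods from 1850 to 2000, compounds to an eightfold increase overall, equivalent to a surprisingly modest annual growth rate. A quick check:

```python
# Three successive doublings (1850-1900, 1900-1950, 1950-2000).
doublings = 3
total_growth = 2 ** doublings
print(f"Production grew {total_growth}x from 1850 to 2000")

# Equivalent compound annual growth rate over the full 150 years.
annual_rate = total_growth ** (1 / 150) - 1
print(f"Equivalent annual growth: {annual_rate:.2%}")
```

Growth of well under 2% per year, sustained for a century and a half, is enough to exhaust the economically recoverable reserves of a finite metal, which is the crunch the early 2030s face here.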
Much of the recent increase in demand has come from China's growing automobile sector. Because about half of the supply comes from recycled scrap, improved recycling programs are able to meet demand in the short term. Fortunately, lead has ready alternatives for most of its uses, including zinc, copper, iron, and tungsten. However, some of these metals will soon be facing their own shortages too, necessitating the production of artificial replacements. Because of this, the 2030s see an acceleration of global recycling efforts in order to avert a resource crunch. Perennial wheat and corn are becoming profitable. In traditional agriculture, all major grain crops have been "annuals", or short-lived perennials grown as annuals – surviving for just one growing season. They die off once harvested, and then a brand-new crop must be planted to take their place. This requires vast amounts of fuel, fertilizer, herbicides, and pesticides – causing soil erosion and acidification, and disrupting both the nitrogen and carbon cycles. Between 1600 and 2000 AD, the United States lost around one-third of its topsoil. Worldwide, soil erosion was putting the livelihoods of nearly a billion people at risk by the early 21st century. More than a quarter of Earth's land surface had been converted for agricultural use, with more land converted since 1950 than in the previous 150 years. This situation was being exacerbated by rapid population growth, demand for meat products in emerging economies, and increased production of biofuels, along with climate change and peak phosphorus looming on the horizon. Genetic engineering had emerged as an important tool in crop management. One of the more notable advances occurred in 2017, when Bio Cassava Plus received regulatory approval, giving a huge boost to farming in Africa. By the early 2030s, this has been followed by an even greater breakthrough: perennial wheat and corn, which become profitable after many years of development.
Perennial grain crops provide a revolution in agriculture. By growing continuously for two or more years, they are far more efficient than traditional annual crops, requiring less fuel, fertilizer, herbicides, and pesticides. They can store more carbon, maintain better soil quality and water content, and manage nutrients better, thanks to their deeper root systems. The deciphering of bread wheat's genetic code in the early 2010s helped pave the way towards a new generation of perennial crops. The rate of increase in wheat yields, in decline since 1980, is now beginning to recover. However, agriculture will face a new set of challenges later this decade as the effects of climate change begin to accelerate markedly. Web 4.0 is transforming the Internet landscape. Further convergence of the online and physical worlds has led to the emergence of "Web 4.0" – the next generation of the internet. Semantic analysing programs, having evolved into stronger AI, now perform a huge range of automated tasks for business, government, and consumers. Running on massively parallel networks, these applications hunt for textual and visual data – combining the most subtle capabilities of humans (such as pattern recognition) with ways in which machines are already vastly superior (such as speed and memory). In addition to serving as highly advanced search engines, they are playing a major role in the real world – gathering information from the array of sensors, cameras and other tracking devices now present in the environment, on vehicles, and even on people themselves. Although privacy and civil liberties issues are being raised, this new generation of IT promises to bring enormous benefits to society. Crimes are faster and easier to solve thanks to these intelligent virtual agents; transport and logistics are smoother and more efficient; resources can be managed and distributed more accurately.
In addition, practically every physical document in existence has now been digitally encoded, backed up and archived online. This includes full copies of all books, journals, manuscripts, and other literature ever published – forming a complete repository of human knowledge going back thousands of years. These documents can be retrieved and analysed using real-time speech commands, translated from any of the world's 6,000 languages and accessed via 3D holographic imaging. Web 4.0 is also democratising the Internet more than ever before. News agencies are finding themselves increasingly outmoded by bloggers and other social media when it comes to speed and accuracy of information. Stem cell pharmacies are commonplace. Stem cell pharmacies are now a fairly common experience in the developed world, offering walk-in diagnosis, stem-cell collection, and banking services for use in future medical crises. Affordable, personalised, and targeted treatments are becoming available for regenerating various body parts and organs. Married couples are a minority in the UK. By now, marriage in the UK has been reduced to a lifestyle choice enjoyed by a minority, rather than an essential institution of society. The married population has shrunk from almost 50% of adults in 2009, to just 41% now. This trend began in the 1980s. Increasing pressures of work and money, together with the general stresses of the outside world (geopolitical, social, and economic), are putting ever-greater strain on couples. The decline of religious institutions has also played a part. Unmarried partnerships no longer carry the stigma they once had. In addition, increasing numbers of people either working at home alone, or living with their parents, are making it difficult for some to meet potential partners. Another contributory factor is an explosion in the use of virtual reality and other technologies leading to increased isolation of the individual. 
People of all ages spend increasingly large amounts of their time engaged in highly immersive online experiences, requiring little or no interaction with the outside world. Of those who are married, the number of children per couple has declined – and not just in the UK, but other Western societies too. Combined with increasing numbers of Muslim immigrants (who have higher numbers of children), this is significantly altering the demographic balance. Chocolate has become a rare luxury. By now, chocolate has become as rare and expensive as caviar, with even a single bar costing $10-15. Drought, soil depletion and diminishing harvests in Africa – where two-thirds of the world's cocoa is produced – have led to soaring prices. Cocoa is also competing for agricultural space with other commodities like palm oil, which is increasingly in demand for biofuels. Poor pay and working conditions have also been a factor. Many young farmers are now abandoning their lands and heading to the cities, in search of better and more highly paid jobs.

2032: A giant artificial island project opens in Hong Kong. Lantau Tomorrow Vision is a HK$500 billion (US$64 billion) megaproject in Hong Kong, developed in response to overcrowding and sky-high property prices. Located on reclaimed land covering an area of 1,700 hectares, it creates a third core business district – plus residential units for 1.1 million people, of which 70% is public housing. In total, this generates 340,000 new jobs. A series of giant artificial islands are constructed between Lantau (in the west) and Hong Kong Island (in the east). The project is the largest of its kind ever attempted in Hong Kong and there is great controversy surrounding its cost. However, the initial outlay is eventually recouped, with land revenue ultimately reaching almost HK$1 trillion (US$128 billion). Approximately half of the development costs go into transport and infrastructure. The project also benefits from close proximity to the world's longest sea bridge, completed in 2018, which connects Hong Kong with Macau further to the west. This allows it to serve as a gateway between the two autonomous regions. The new islands are protected against climate change. They are elevated as high as the nearby Hong Kong International Airport – which opened in 1998 and also sits on reclaimed land – six metres (19 feet) above sea level and able to withstand super typhoons. Construction begins in 2025, with the first residents moving in by 2032. The majority of UK homes are rented. By 2032, house prices in the UK have become so unaffordable that the majority of people are forced to take rented accommodation. This trend first began to emerge during the Blair years of the late 90s and early 2000s. It could be argued, however, that the problem originated as far back as 1980, when the Conservative government led by Margaret Thatcher passed the Housing Act. 
This led to a fall in socially rented housing – as council tenants were given the legal right to buy, at a large discount, the home they were living in – while councils were prevented from reinvesting most of the proceeds from these sales into building new homes. Many tenants who had purchased these council flats later profited from them as buy-to-let landlords – effectively subsidised by taxpayers – or sold them to speculators, investors, and property firms. About 1.5 million council homes were sold by 2003 and this figure had reached 2 million by 2015. A failure to construct enough new homes, combined with rapid population growth (especially from immigration), resulted in a serious lack of supply during the early 21st century. Other factors included changes in the employment landscape, a rise in the number of students, later marriage, and rising separation rates. Having been relatively stable for most of the 20th century, the average cost of a UK home rose from £50K in 1995 to £184K by 2007. During this same period, mortgage payments as a percentage of income soared from 18% to more than 50%. The problem was compounded by stagnant wage growth (below inflation), a decline in the level of household savings (from 16% in the early 90s, to just 6% within two decades) and tighter lending requirements in the aftermath of the Great Recession. Subsequent attempts to rectify the situation included policies such as the "Help to Buy" scheme, but these only exacerbated the problem by creating artificially inflated demand. The fundamental issue was lack of supply – but government funding and policies came nowhere near addressing this point, offering only tinkering around the edges to boost housing stock. Because of these failures, less than half – 49% – of UK households are homeowners by 2032 – the first time since the early 70s that a majority of people are renting. One-third of households are now renting privately, twice as many as in 2015.
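The price figures above imply a striking rate of appreciation: £50K in 1995 to £184K in 2007 works out to roughly 11-12% per year, far outpacing the stagnant wages the text describes. A quick check:

```python
# Implied compound annual house-price growth, 1995-2007.
price_1995 = 50_000
price_2007 = 184_000
years = 2007 - 1995

implied_growth = (price_2007 / price_1995) ** (1 / years) - 1
print(f"Implied annual house-price growth: {implied_growth:.1%}")
```

With wages growing below inflation over the same period, double-digit annual price growth is what pushed mortgage payments from 18% to more than 50% of income.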
London and the southeast of England have been particularly affected, due to a massive influx of foreign billionaires pouring money into the region and pushing up land values. The gap between rich and poor – and between the younger and older generations – has now grown to be wider than ever, creating an ever more polarised and unequal society. Britain's ash trees have been wiped out by a fungal disease. Ash dieback, caused by the fungus Chalara fraxinea, killed huge numbers of trees from the mid-1990s onwards, particularly in eastern and northern Europe. Up to 90% of ash trees were affected in Denmark. The fungus was first scientifically described in 2006. It was discovered in the UK during 2012, initially only on imported nursery stock, but in October of that year it was found on trees at two sites of established woodland in the East Anglia region. This occurred despite clear warnings from ecologists and foresters that imports of seedlings from the continent should be banned in case of infection. Despite efforts to contain the disease, it was impossible to stop. Within a few weeks, Chalara fraxinea was confirmed in dozens of other locations. Over the next two decades, it spread throughout the country, wiping out most of the 90 million ash trees in Britain. Many plant species, birds, and other animals dependent upon the trees for survival were also lost, at a time when their numbers were already in sharp decline. With ash trees forming a significant proportion of the UK's woodland, an eerie silence is descending on many areas of countryside, with birdsongs and other wildlife becoming ever rarer. Leatherback sea turtles are on the verge of extinction. Growing up to seven feet (two metres) long and exceeding 2,000 pounds (900 kilograms), leatherbacks are the largest turtles on Earth. They can dive to depths of nearly 4,000 feet and make trans-Pacific migrations from Indonesia to the U.S. Pacific coast and back again. 
These ancient reptiles are the only remaining members of a family of turtles with evolutionary roots going back 100 million years. After mating at sea, females come ashore during the breeding season to nest. At night, they excavate a hole in the sand, depositing around 80 eggs. This is filled with sand, making detection by predators difficult, before the mother turtle returns to the sea. Once common throughout the world, their population declined rapidly during the 20th century and into the 21st. At the Jamursba Medi Beach in Papua Barat, Indonesia – accounting for 75 percent of total sightings in the western Pacific – nest numbers plummeted from a peak of 14,455 in 1984 to a low of 1,532 in 2011. Several major problems faced leatherback turtles: nesting beach predators, such as pigs and dogs that were introduced to the islands, eating the turtle eggs; rising sand temperatures that killed the eggs or prevented the production of male hatchlings; the danger of being caught by fisheries during migrations; and harvesting of adults and eggs for food by islanders. Plastic debris, mistaken for their favourite food (jellyfish) was another problem. Some individuals were found to have ingested almost 11 pounds (5 kilograms) of plastic into their stomachs. China's space station is deorbited. China's first space station has reached the end of its 10-year lifespan. After a decade of onboard research, it is abandoned and sent into a decaying orbit. A new, larger, and more advanced space station is now in the process of being constructed. 4th generation nuclear power. By this date, 4th generation nuclear power plants are commercially available. They utilise a system of small balls, rather than large fuel rods. They are a major improvement over previous generations, for the following reasons: It is physically impossible for them to have a runaway chain reaction, as happened with Chernobyl. No error, human or otherwise, will ever produce a meltdown. 
The uranium fuel is enriched to only 9% – far below weapons grade – making it useless for terrorist nuclear weapons. The nuclear waste is much easier to dispose of. They are highly economical. Electricity can be generated more cheaply than oil or gas power, even when the decommissioning costs of the stations are considered. For these reasons, nuclear power becomes a lucrative industry from the 2030s onwards. China and India, in particular, take advantage of this enhanced power source. Solar and wind power have greater long-term potential, however, due to the finite supply of uranium. One-third of Saudi Arabia's electricity comes from solar. In 2012, the Kingdom of Saudi Arabia had only 0.003 gigawatts (GW) of installed solar energy capacity. More than half of its electricity was created from burning oil. By 2032, however, it has 41 GW of installed solar energy, accounting for a third of the nation's 121 GW total energy demand. About 25 GW is produced by solar thermal plants, which use mirrors to concentrate the Sun's energy, heating fluids that in turn drive turbines. The other 16 GW is provided by massive photovoltaic farms. This has been the result of considerable foreign investment, as well as the wealth produced by fossil fuels, totalling over $100 billion. Though several other nations have more extensive solar infrastructure, this has been one of the most ambitious projects, especially considering Saudi Arabia's former position as the world's largest exporter of crude oil. The enormous expanses of desert, as well as measurably more intense sunlight in the equatorial regions, give the country a huge amount of room to expand further. Even larger projects are now planned. Construction is also underway on high voltage cables connecting Saudi Arabia to neighbouring countries and some in Southern Europe. Eventually, this will be expanded to include all of Europe and Northern Africa. Alongside solar, another 21 GW is generated by a combination of nuclear, wind, and geothermal power.
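The capacity shares quoted above can be sanity-checked with some quick arithmetic (a sketch; all gigawatt values are taken from the text, and capacity is treated as a simple fraction of demand):

```python
# Saudi Arabia's 2032 energy mix, using the figures quoted in the text.
solar_thermal_gw = 25
solar_pv_gw = 16
other_low_carbon_gw = 21   # nuclear, wind and geothermal combined
total_demand_gw = 121

solar_gw = solar_thermal_gw + solar_pv_gw            # 41 GW of solar
solar_share = solar_gw / total_demand_gw             # ~0.34, i.e. about a third
low_carbon_share = (solar_gw + other_low_carbon_gw) / total_demand_gw

print(f"Solar: {solar_gw} GW ({solar_share:.0%} of demand)")
print(f"Solar plus nuclear/wind/geothermal: {low_carbon_share:.0%} of demand")
```

The 25 GW + 16 GW breakdown does indeed sum to the 41 GW headline figure, and 41/121 is consistent with the "one-third" claim.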
Through nuclear cooperation agreements with China, France, South Korea and Argentina, Saudi Arabia has now constructed 16 new nuclear reactors. Longer term, the country has further ambitions to be powered entirely by renewables. Much of the oil industry in Saudi Arabia is now transitioning away from energy production to the manufacture of plastics and polymers. These developments, coupled with the fact that it remains one of the wealthiest countries in the region, are helping Saudi Arabia transition to a more sustainable long-term future. Additionally, its strong military, ties to the West, and extensive desalination infrastructure have allowed Saudi Arabia to remain relatively stable compared to some of its neighbours. Terabit internet speeds are commonplace. In addition to the benefits resulting from Web 4.0, connection speeds have also vastly improved. Bandwidth has been growing by roughly 50% each year. Many homes and offices in the developed world now have a terabit connection. Some of these connections are now appearing on people themselves, in the form of implantable devices. Britain upgrades its nuclear-armed submarine fleet. Britain was the third country (after the U.S. and the Soviet Union) to test an independently developed nuclear weapon, in 1952. From 1969 onwards the country always had at least one ballistic-missile submarine on patrol, creating a nuclear deterrent that the Defence Council described in 1980 as "effectively invulnerable to pre-emptive attack". Until the 1990s, Britain had deployed a variety of nuclear weapons on Royal Navy carriers, V bombers and other aircraft around the world. However, these were gradually withdrawn. The retirement of the WE.177, in both air-dropped free-fall and depth charge versions, was the final stage of this decommissioning process, in 1998. This left a group of four Vanguard submarines (armed with Trident II D5s) as Britain's only nuclear weapons platform. 
Britain retained a stockpile of about 215 thermonuclear warheads, with 120 operational as of 2016. A decision to renew the Trident-armed submarines was made in 2006, with Prime Minister Tony Blair warning that it would be "unwise and dangerous" for Britain to give up its nuclear weapons. Although the Cold War had ended, the UK still needed nuclear weapons, said Blair, as no-one could be sure another nuclear threat would not emerge in the future. He outlined plans to spend up to £20bn on a new generation of submarines for Trident missiles. However, the Trident programme was to prove highly controversial, with costs escalating considerably. This was an especially divisive topic in the wake of the 2008 financial crash and the subsequent years of austerity and cuts to public services. By 2016, the Ministry of Defence had revised the cost of building, testing, and commissioning the replacement vessels to £31 billion (plus a contingency fund of £10 billion) over 35 years, about 6 per cent of defence spending every year. Nevertheless, MPs voted to back the renewal in a vote of 472 to 117. Alongside this programme, the government reiterated its commitment to multilateral nuclear disarmament, promising to reduce Britain's stockpile of nuclear warheads to 180 by the mid-2020s. Trident was based on the idea of Mutually Assured Destruction – it aimed to deter a nuclear attack on Britain by guaranteeing a retaliatory strike against any potential aggressor. After succeeding David Cameron as Prime Minister, one of Theresa May's first jobs was to write a "letter of last resort" authorising the use of Trident missiles in the event of such a nightmare scenario. Each of the Vanguard submarines contained a safe with one of these sealed letters, which of course it was hoped would never have to be used. David Cameron's previous letters were destroyed. Previously named Successor class, it was announced in October 2016 that the new submarines would be renamed as the Dreadnought class. 
Construction began soon after the parliamentary vote on the upgrade programme. The first new submarine begins operation by 2028, and the existing fleet of Vanguard submarines is phased out by 2032, after more than 40 years of service. Each missile is 13.4 m (44 ft) long, weighs 58.5 tons (130,000 lb), and has a range of 11,300 kilometres (7,000 mi), a top speed of 18,030 mph (29,020 km/h) (Mach 24) and target accuracy to within a few feet. Despite its diminishing power in the world, Britain retains a nuclear deterrent for long into the future and plays a key role in maintaining global security. Dreadnought-class submarines remain in service until the 2060s, by which time they are replaced by automated and crewless vessels. Transit of Mercury. A transit of Mercury takes place when the planet Mercury passes directly (transits) between the Sun and a superior planet, becoming visible against (and hence obscuring a small portion of) the solar disk. During a transit, Mercury appears as a tiny black dot moving across the disk of the Sun. A transit of Mercury occurs on 13th November 2032. The last time this happened was 11th November 2019. After 2032, the next occasion is in 2039, with a further eight transits during the remainder of the 21st century. Each transit typically lasts for several hours. Transits of Venus with respect to Earth can also occur – although these are rarer, since Venus is further from the Sun and orbits more slowly. Rarer still are simultaneous transits of both Mercury and Venus. Such an event last occurred in 373,173 BC and the next occurs on 26th July 69,163.
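The transit schedule above can be checked against the remaining 21st-century Mercury transit years (a sketch; the year list is an assumption drawn from standard astronomical transit tables, not from the text):

```python
# Transit-of-Mercury years from 2032 to the end of the 21st century,
# per standard astronomical tables (assumed list, used to verify the
# "further eight transits" claim in the text).
transits = [2032, 2039, 2049, 2052, 2062, 2065, 2078, 2085, 2095, 2098]

remaining_after_2039 = [y for y in transits if y > 2039]
print(len(remaining_after_2039))                       # 8 further transits

gaps = [b - a for a, b in zip(transits, transits[1:])]
print(gaps)   # intervals of roughly 3, 7, 10 and 13 years between transits
```

The count confirms the text: one transit in 2039, then eight more before 2100.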

2033: Crewed mission to Phobos. In 2033, NASA conducts the first crewed mission to Phobos, the innermost and larger of the two natural satellites of Mars. This is the latest in a series of ambitious new targets to expand the human exploration of space. Previous missions had seen astronauts returning to the Moon and constructing a new space station in lunar orbit. This latest mission – over 60 years after the Apollo era – involves orbiting Mars first, before landing on the surface of Phobos. While the average distance to the Moon is 384,000 km (239,000 miles), a journey to Mars is nearly 600 times further, at 225 million km (139 million miles), and poses major technical and financial challenges. However, going to Phobos before Mars itself is less expensive and lower risk, while allowing NASA to demonstrate key technologies for the more advanced missions later on. When used in near-Earth space, the Multi-Purpose Crew Vehicle (MPCV) is capable of carrying four astronauts on 21-day missions. For journeys to more remote destinations requiring months of travel, such as Mars, it can be attached to other modules with longer-term consumables and support capabilities. Astronauts are able to manoeuvre around the low gravity environment of Phobos using jetpacks and other equipment, while collecting samples and performing a variety of science experiments. This "orbit first" mission serves as a precursor to landings on Mars itself in the late 2030s and early 2040s. Peak phosphorus is reached. Phosphorus is a basic building block of life, playing a vital role in the structural framework of DNA and RNA. Found in the cell membranes of animals and plants, it is essential for the transfer of energy. A main component of fertilisers, it helps plants to survive temperature changes and water deficiencies. This element is fundamental to the modern growing of crops.
Phosphorus is a scarce and finite resource on Earth, and due to its non-gaseous environmental cycle, it cannot be replaced by anything else. For a long time, this problem was largely overlooked by governments – most of whom took a complacent attitude, assuming that this mineral would be around for centuries or more. It was rarely viewed as a political issue, with most talk about the chemical being focussed on its polluting effects, rather than its potential scarcity. New studies in the 2000s and early 2010s, however, revealed that supplies were dwindling much faster than had previously been thought. This trend was being accelerated by emerging economies such as China and India – countries in which there was ever-increasing demand for meat and dairy products, which correspondingly required more and more phosphorus to produce. By 2033, worldwide production of phosphorus has peaked. The immediate impact is an alarming increase in the price of food, as well as government nationalisation of phosphate reserves and the introduction of export tariffs. Some regions undergo famines, while others are forced to introduce emergency rationing. Food prices have also doubled due to climate change, adding further woe. Richer nations are better prepared for this crisis – but nevertheless, many have experienced a significant period of readjustment with new methods being mandated and deployed to capture, store, and recycle phosphorus. Among the most widely used short-term innovations is recycling of human urine (a phosphate-rich substance), although this is only a temporary solution. Extracting phosphorus from the seabed is another option under consideration, but it presents major technological and financial challenges. With global population continuing to climb rapidly, the race is now on for longer-term fixes – large-scale systems that can capture and recycle phosphorus indefinitely, since no other element can substitute for it. Hypersonic airliners are entering service.
Following decades of research and development, a new generation of aeroplanes is entering commercial service. These aircraft have a cruising speed of Mach 5 – or about 3,800 mph – enabling them to fly from Europe to Australia in less than four hours. With a range of more than 20,000 km (12,000 miles) they can perform this journey without refuelling and have excellent subsonic and supersonic fuel efficiency, thus avoiding the problems inherent in earlier supersonic aircraft. Another advantage is that, while the 150-metre-long designs are bigger than previous jets, they are actually lighter than Boeing 747s and can utilise conventional runways. Take-off noise is moderate, too. In many ways, they are the spiritual successor of Concorde. They lack windows, however: the heat generated by travelling at such speeds makes it difficult to install windows that are not prohibitively heavy. One solution to this problem has been the installation of flat screen displays, showing images of the scene outside. The final phase of Britain's HS2 rail link is completed. High Speed 1 (HS1), also known as the Channel Tunnel Rail Link (CTRL), was a 108 km (67 mi) high-speed rail line, running from London to the British end of the Channel Tunnel. Completed in 2007, this route into continental Europe had only a single operator at the time – Eurostar, which provided trains to Paris, Brussels, and seasonal destinations in southern France. Additional services became available in the 2010s, allowing direct high-speed rail from London to Frankfurt and Amsterdam. The development of high-speed rail sparked further interest and debate in Britain and was supported in principle by the three main political parties. Detailed plans were drawn up for a domestic network, linking together some of the nation's largest commuter cities. Though much controversy surrounded which cities should be served, as well as the environmental performance and impact, the plans were finally approved in January 2012.
High Speed 2 (HS2) would connect London with the Midlands and the North of England. It would be developed by High Speed Two Ltd, a company established by the government. The planned route took the form of a "Y" shape, with a central trunk going from London to England's next largest city, Birmingham, which then forked into two spurs: one to Manchester and the other to Leeds. HS2 was built in stages, the London to Birmingham section being the first, with construction starting in 2016 and the first trains running by the mid-2020s. There would be no intermediate calling points: trains would travel directly between London and Birmingham at speeds of 400 km/h (250 mph), cutting the journey time from 1 hour 24 minutes to just 49 minutes. By 2033, the Manchester and Leeds branches are completed. Journey times from London to Manchester are reduced from 2 hours 8 minutes to 1 hour 20 minutes. Journeys from London to Leeds are reduced from 2 hours 20 minutes to 1 hour 20 minutes. Additional high-speed lines to Newcastle, Edinburgh and Glasgow are now being planned. The total cost of the project is over £32 bn ($49 bn), making it the UK's largest rail expansion in almost a century. Congestion is greatly relieved on other networks and there are significant economic benefits, with over a million new jobs created. As part of the plans, Euston station in London is fully redeveloped and there is also a connection running to Heathrow airport, one of the world's busiest aviation hubs. Lung disease in China has killed over 80 million by this time. This has resulted from the combined long-term effects of (a) pollution – 20 of the 30 most polluted cities in the world are in China; (b) huge numbers of smokers – around 50% of adults; and (c) the widespread practice of burning wood or coal at home for cooking and heating – used by over 65% of the population.
China has begun switching to cleaner fuels by this time, however, and is implementing a new programme of taxation, better health education and tobacco advertising bans. This begins to reduce the proportion of deaths from lung disease from around this time onwards. Five-year survival rates for kidney cancer are approaching 100%. In the early 2010s, there were around 209,000 new cases of kidney cancer diagnosed in the world each year, accounting for just under 2% of all cancers. The highest rates were recorded in North America and the lowest rates in Asian and African regions. Factors known to increase the risk of kidney cancer include smoking, which can double the risk of the disease; regular use of nonsteroidal anti-inflammatory drugs (NSAIDs) such as ibuprofen or naproxen; obesity; faulty genes; a family history of kidney cancer; having kidney disease that needs dialysis; being infected with hepatitis C; and previous treatment for testicular cancer or cervical cancer. Kidney cancer does not usually respond to chemotherapy or radiotherapy. If the cancer has not spread, it is normally removed by surgery. The increasing use of robots in hospitals has led to greatly improved accuracy and turnaround times. Together with new drug treatments, advances in cryotherapy (freezing the tumour away), radiofrequency ablation (burning the tumour away), gene therapy and other techniques, kidney cancer is gradually being defeated. In most of the developed world, five-year survival rates are now approaching 100%.

2034: Service robots number one billion worldwide. By the mid-2030s, the number of service robots has reached one billion worldwide and continues to grow rapidly. Service robots are generally divided into two separate groups: personal and professional. The former are used in non-commercial settings, usually by laypersons. Examples would include domestic units such as vacuum cleaners, lawn mowers, kitchen chefs, personal mobility assistants, toys, and pet exercise robots. The latter group, professional service robots, tend to be used for commercial tasks and are normally operated or monitored by properly trained personnel. Examples can include medical robots performing surgical operations, fire-fighting robots, automated security patrols, machines to clean public places, delivery robots and others designed to assist with retail or leisure environments. Both personal and professional service robots are separated from another category of machines: industrial robots. This third group is nowhere near as numerous or visible in everyday life, though still growing at a rapid rate during this time. The disruption caused by AI, automation and robotisation has accelerated in recent years, creating a backlash among the general public. Some of the countries most affected include Germany, Italy, Japan, South Korea, and Switzerland, where one-quarter of traditional work activities have been displaced. While some employees are able to transfer into other industries, substantial numbers are unable to do so. This is leading to calls for increased support and intervention by both governments and businesses, such as providing retraining and education for those affected. More radical initiatives, such as universal basic incomes (UBI), have also seen increased public support, with an ever-growing number of countries and regions willing to experiment with the idea.
One of the sectors under most pressure in 2034 is retail, with machines having recently surpassed humans in the majority of tasks. Shop robots are now a common sight in large grocery, hardware, and other stores, where they roam the aisles and restock or rearrange items, perform security functions, and handle tasks such as cleaning floors. Unlike the "dumb" machines of earlier generations, robots of the 2030s are considerably smarter – highly adaptable to their surroundings and work situations, able to instantly recognise and interact with countless objects, while providing real-time information to customers. This has come about through exponential improvements in machine learning, cloud computing, bandwidth, sensor technology and so on. The increasingly sparse human staffing – especially when combined with cashierless payment systems – can make for a rather dehumanising and impersonal experience compared to traditional stores of the past. Senior citizens, in particular, find it hard to accept the changes. However, the drive towards ever greater efficiency and productivity has made this trend unstoppable, with more and more businesses deploying robots. In some nations, such as Japan, China, South Korea, and Taiwan, they are becoming a vital necessity due to aging populations and shrinking workforces. NASA's Dragonfly spacecraft lands on Titan. Dragonfly is the fourth mission in NASA's New Frontiers program, chosen by the agency in June 2019. Launched to Saturn in 2026, it arrives on the surface of its large moon Titan in 2034. The probe, weighing approximately 450 kg (990 lb), lands by parachute in the equatorial "Shangri-La" dune fields, which resemble the linear dunes of Namibia in southern Africa and offer a diverse sampling location. NASA's mission planners analyse many years of earlier Cassini data to choose a calm weather period, along with a safe initial landing site and scientifically interesting targets.
Dragonfly consists of a rotorcraft lander, much like a large quadcopter with double rotors – an octocopter. Its redundant configuration enables it to tolerate the loss of at least one rotor or motor. The craft performs vertical take-offs and landings (VTOL) and controlled flights between locations, powered by a radioisotope thermoelectric generator (RTG). It can travel at 36 km/h (22 mph) or about 10 m/s and rise to an altitude of 4 km (2.5 miles). The craft is designed to operate at temperatures averaging −179.2 °C (−290.5 °F). Taking advantage of Titan's dense atmosphere and low gravity (flight requires 38 times less power than on Earth), Dragonfly explores dozens of locations across the icy world, covering a total of 175 km (109 miles) over a three-year period. It samples and measures the compositions of its organic surface materials to characterise the habitability of Titan's environment and investigate the progression of prebiotic chemistry. The primary mission target is the huge Selk crater, produced by an impact large enough to have melted Titan's water-ice crust and liberated oxygen in the distant past. The craft remains on the ground during Titan's nights, which last 192 hours, or eight Earth days. Activities during the night include seismological studies and meteorological monitoring, sample analysis and local microscopic imaging using LED illuminators. It communicates directly with Earth – more than a billion kilometres away – using a high-gain antenna, with a transmission delay of 79 minutes. In addition to spectrometers, meteorological sensors and a seismometer, the scientific payload includes high-resolution panoramic cameras to image Titan's terrain and scout for scientifically interesting landing sites. The Laser Interferometer Space Antenna (LISA) is launched. The Laser Interferometer Space Antenna (LISA) is a gravitational wave observatory launched by the European Space Agency.
This project is the third of three L-class (Large) missions in the "Cosmic Vision" programme which includes two other spacecraft – the Jupiter Icy Moon Explorer (JUICE) launched in 2022 and the Advanced Telescope for High Energy Astrophysics (ATHENA) deployed in 2028. LISA is designed to sense gravitational waves – tiny ripples in the fabric of space-time – with extreme precision. Three spacecraft are placed in a triangular formation with 2.5-million-kilometre sides, flying along an Earth-like heliocentric orbit. Laser interferometry is used to monitor fluctuations in the relative distances between them, with a resolution of just 20 picometres (20 trillionths of a metre, or smaller than a helium atom). To eliminate non-gravitational forces such as light pressure and solar wind on the test masses, each spacecraft is constructed as a zero-drag satellite and effectively "floats" around the masses, using capacitive sensing to determine their relative position, with ultra-precise thrusters to remain properly centred at all times. Previous searches for gravitational waves in space were conducted for short periods by planetary missions with other primary objectives (such as Cassini–Huygens), using microwave Doppler tracking to monitor fluctuations in the Earth-spacecraft distance. By contrast, LISA is a dedicated mission using laser interferometry to achieve a much higher sensitivity. Other antennas had been operational on Earth, but their sensitivity at low frequencies was limited by the largest practical arm lengths, seismic noise, and interference from nearby moving masses. Passing gravitational waves alternately squeeze and stretch objects by a tiny amount. 
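The scale of "tiny" can be illustrated with a one-line strain estimate from the figures above (a sketch; LISA's real sensitivity varies with frequency and noise sources, so this is only an order-of-magnitude indication):

```python
# Order-of-magnitude strain sensitivity implied by the arm length and
# displacement resolution quoted in the text.
arm_length_m = 2.5e6 * 1e3    # 2.5 million km, converted to metres
resolution_m = 20e-12         # 20 picometres

# Gravitational-wave strain is the fractional change in length, h = dL/L.
strain = resolution_m / arm_length_m
print(f"Detectable strain h = dL/L ~ {strain:.0e}")
```

A 20-picometre fluctuation over a 2.5-million-kilometre arm corresponds to a strain of order 10⁻²⁰ – which is why arms of this length are needed at all, and why ground-based detectors with kilometre-scale arms struggle at low frequencies.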
These waves are caused by energetic events in the Universe, such as massive black holes merging at the centre of galaxies; black holes consuming small compact objects like neutron stars and white dwarfs; supernova star explosions; remnants from the very early phase of the Big Bang and possibly theoretical objects like cosmic strings and domain walls. Since LISA is the first dedicated, space-based gravitational wave detector, the mission adds a whole new sense to our perception of the Universe – enabling astronomers to "hear" events in ways not possible before and revealing many important phenomena that were previously invisible. EnVision makes orbital insertion at Venus. EnVision is a spacecraft developed by the European Space Agency (ESA) to study Venus. ESA selected EnVision in 2021 as the fifth Medium-class mission for its Cosmic Vision plan. Launched in 2031, it takes 15 months to reach the planet and a further 16 months to achieve orbit circularisation via aerobraking. The probe is designed to help scientists understand the relationships between Venus' geological activity and the atmosphere, potentially finding clues as to why Venus and Earth took such different evolutionary paths. It determines the level and nature of current activity, reveals the sequence of geological events that generated its range of surface features and confirms whether Venus once had oceans or was hospitable for life. It provides new knowledge of the organising geodynamic framework that controls the release of internal heat from the planet. The probe's altitude ranges from 220 to 470 km and the mission has a duration of 4.5 years. Three instruments are included, plus a radio science experiment. The Venus Synthetic Aperture Radar (VenSAR) enables several imaging and ranging techniques at spatial resolutions as fine as 10 metres.
It characterises structural and geomorphic evidence of multi-scale processes that shaped the geologic history of Venus and reveals current volcanic, tectonic, and sedimentary activity. A second instrument, the Venus Subsurface Radar Sounder (SRS), penetrates the top kilometre of the subsurface, looking for underground layering and buried boundaries. This includes impact craters and their infilling, tesserae and their edges, lava flows and their edges, plains, and tectonic features, in order to uncover stratigraphic relationships at various depth ranges and horizontal scales. The third instrument, the Venus Spectroscopy Suite (VenSpec), provides compositional data on rock types, extremely high-resolution atmospheric measurements, detailed monitoring of sulphurous gases, and scanning of the mysterious UV absorber in the Venusian upper clouds. In addition to its three main instruments, a gravity and radio system maps Venus' gravity field in high resolution, allowing its deep internal structure to be probed and confirming the size and current state of the core. It also measures atmospheric properties through radio occultation. EnVision is the latest in a series of probes to study Venus in recent years – the others being Russia's Venera-D surface lander, as well as NASA's VERITAS and DAVINCI+ missions – greatly improving the scientific knowledge of this planet. Australia has fully decarbonised its electricity supply. By 2034, virtually all of Australia's electricity is supplied from renewable energy sources, such as solar, wind, biomass, and hydropower. For many years, Australia had struggled to break its addiction to fossil fuels. The coal industry, in particular, held enormous sway over the political system. PM Tony Abbott, describing coal as "good for humanity", insisted it would be the "world's main energy source for decades to come." He was followed by Malcolm Turnbull who, although promising to take climate change seriously, did little to alter the status quo.
Turnbull was succeeded in 2018 by Scott Morrison, a fanatical lover of coal, who had once mocked the opposition by bringing a lump of coal into the debating chamber. Coal production in Australia increased substantially at the dawn of the 21st century – growing by 80% between 2000 and 2015. Australia was the biggest net exporter of coal in the mid-2010s, accounting for 32% of global exports (389 Mt out of 1,213 Mt total) and employing 50,000 people. The enormous Carmichael coal mine was proposed in the northeast of the country, a megaproject that was expected to produce 2.3 billion tonnes of coal over 60 years. This drew immense controversy about its claimed economic benefits, financial viability, plans for government subsidy and damaging environmental impacts. Alongside its potential harm to the nearby Great Barrier Reef (through dumping of dredge spoil), concerns were raised about groundwater pollution and the clearance of threatened species at the site, including koalas. Furthermore, the vast amount of carbon emissions, from this single mine alone, would be around 0.5% of the worldwide carbon budget limit for avoiding 2°C of warming. Nevertheless, the project continued to move forward, receiving billions of dollars in government assistance. Carmichael was just the first of a number of large mines proposed for the Galilee Basin and was intended to facilitate their development too. Australia had experienced record growth in coal exports, with its largest importers being Japan (34%), China (24%), South Korea (15%) and India (14%). Much of the remaining glut of coal was consumed domestically, accounting for nearly 50% of Australia's electricity supply in 2015. The country was among the highest of the developed nations in terms of carbon footprint per capita. However, the world was changing fast; in particular, with regards to energy production. Australia was not immune to the revolution now underway and faced losing billions of dollars in stranded assets. 
Early warning signs had provided a glimpse of what was to come – such as the auctions held in India, during which solar-generated electricity became cheaper than coal. This had followed earlier reports that international coal projects relying on new import markets faced major financial risks. Even when the costs of storage (to make intermittent power sources reliable) were added, renewable energy was becoming Australia's cheapest energy option. These changes had become more and more obvious in the late 2010s and continued into the 2020s. Between 2018 and 2019 the nation installed over 10,400MW of new renewable energy, helped by Australia's superb geographical placement and high level of solar irradiance. The accelerating trends in solar and wind power were accompanied by a corresponding decline in coal production, as mines and power stations proved to be economically unviable, either shutting down or being mothballed indefinitely. Renewables were even beginning to match the wholesale price of gas in Australia, meaning that solar-generated electricity could directly compete for provision of industrial heat. By 2025, renewables had overtaken fossil fuels when measured by terawatt-hours (TWh) and were supplying 50% of Australia's electricity. The large-scale exposure of investors to stranded assets was producing a "carbon bubble" with major economic repercussions – particularly for Australia, but also many other countries around the world. As the plunging value of fossil fuels became apparent, the bursting of this bubble created one of the largest economic crises of the first half of the 21st century. This setback was only temporary, however, as the world transitioned to a clean energy future. By 2034, renewables are supplying essentially all of Australia's electricity. Switzerland phases out nuclear energy. After the Fukushima disaster in Japan, questions were raised about the long-term viability of nuclear power. 
Switzerland was among the nations to abandon this form of energy production, following public protests and a government review. The country’s five existing reactors – supplying about 40% of the country's power – were allowed to continue operating but were not replaced at the end of their life span. The last plant is taken offline in 2034. Caribbean coral reefs are in danger of being wiped out. Often called "rainforests of the sea", coral reefs form some of the most diverse ecosystems on Earth. Historically, they have occupied less than 0.1% of the world's ocean surface – about half the area of France – yet provided a home for 25% of all marine species. Delivering a range of ecosystem services to tourism, fisheries and shoreline protection, the global economic value of coral reefs at one time was estimated at up to $375 billion each year. However, coral reefs are fragile ecosystems, partly because they are so sensitive to water temperature. In the early 21st century, they were under threat from climate change, oceanic acidification, blast fishing, cyanide fishing for aquarium fish, sunscreen use, overuse of reef resources, and harmful land-use practices, including urban and agricultural runoff and water pollution, harming reefs by encouraging excess algal growth. The Caribbean – home to 9% of the world’s coral – saw a 50% decline between 1970 and 2012, leaving just one-sixth of the pre-industrial reef cover. According to a detailed analysis in 2014, virtually all of the remaining Caribbean coral reefs would disappear within 20 years, based on current trends. Climate change had once been seen as the main culprit – warming the waters to cause bleaching, while ocean acidification lowered pH levels. While acidification was still a serious threat, new data suggested that a loss of parrotfish and sea urchins – the area’s two main grazers – was, in fact, the biggest driver of coral decline in this particular region.
For example, an order-of-magnitude increase in bulk shipping during the 1960s-70s introduced pathogens and invasive species near the Panama Canal that later spread to the Caribbean. An unidentified disease led to a mass mortality of sea urchins in the 1980s, while extreme overfishing brought parrotfish to the brink of extinction in some regions. Loss of these species broke the delicate balance of coral ecosystems and enabled the algae – on which those grazers fed – to smother the reefs. Areas protected from overfishing, as well as other threats such as pollution, tourist activity and coastal development, were more resilient to pressures from climate change. Some of the healthiest coral reefs, with high populations of grazing parrotfish, included the Flower Garden Banks National Marine Sanctuary in the northern Gulf of Mexico, Bermuda, and Bonaire, all of which banned or restricted fishing practices that harmed the fish. Reefs where the parrotfish were not protected suffered tragic declines – as in Jamaica, along the entire Florida Reef Tract from Miami to Key West, and in the U.S. Virgin Islands. Attempts were made in subsequent decades to protect these species across a wider area and restore the balance between algae and coral using better management strategies. Although some of these efforts achieved modest success, short-term economic pressures and business interests tended to outweigh these concerns. The region as a whole remained under serious threat, and by 2034, Caribbean coral reefs have edged further towards complete collapse. Coastal erosion has destroyed hundreds of UK homes. Rising sea levels and increased storm intensity have begun to seriously affect the British coastline. By the mid-2030s, more than 800 homes have been lost due to erosion. 
While mitigation efforts have been stepped up around the country as a whole, these particular homes were deemed too expensive to save, resulting in their occupants being forced to abandon them and settle elsewhere, with little or no government compensation. During especially stormy years, up to 7 metres (23 ft) of land is eaten away in some places – the highest rate in Europe. The most at-risk areas include Devon, Cornwall, the Isle of Wight, Yorkshire, and East Anglia. As well as buildings, the shrinking coastline has affected farmland, nature reserves and nuclear power plants, along with a nationwide coastal footpath established in the previous decade. Towards the end of this century, the number of homes being lost will increase more than eight-fold to 7,000. If no action were taken, the figure would grow 90 times higher, from 800 to 74,000. A major supermoon occurs. A so-called "supermoon" occurs when a full moon coincides with a lunar perigee – the closest approach of the Moon in its elliptical orbit around Earth. Such events produce the largest apparent size of the lunar disk as seen from Earth, making it appear up to 15% larger and 30% brighter than during apogee (its most distant point from Earth). This provides great opportunities for astronomers and photographers. Particularly dramatic supermoons occur when a full moon and lunar perigee also happen when Earth is at perihelion (its closest point to the Sun for the year). On the morning of 14th November 2016, the distance between the centre of the Moon and Earth was 221,524 miles (356,509 km), the closest the two had been during a full moon since 1948. The Moon would not appear this large again until 25th November 2034. The closest supermoon of the century occurs on 6th December 2052. Notable supermoons are also observed in 2070, 2088 and 2098.
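The size and brightness figures above follow directly from the inverse and inverse-square distance laws: apparent diameter scales as 1/distance, and brightness as 1/distance². A minimal sketch in Python – the perigee distance is the 2016 figure quoted above, while the apogee distance of roughly 406,700 km is an assumed typical value, not given in the text:

```python
# Apparent size and brightness of a perigee full moon vs. an apogee full moon.
perigee_km = 356509.0   # 14 November 2016 perigee, as quoted above
apogee_km = 406700.0    # typical apogee distance (assumed value)

size_ratio = apogee_km / perigee_km   # apparent diameter scales as 1/distance
brightness_ratio = size_ratio ** 2    # received light scales as 1/distance^2

print(f"{(size_ratio - 1) * 100:.0f}% larger")
print(f"{(brightness_ratio - 1) * 100:.0f}% brighter")
```

The computed increase is about 14% in apparent diameter and 30% in brightness, consistent with the "up to 15% larger and 30% brighter" figures commonly quoted for supermoons.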

2035: Millennials are enjoying an inheritance boom. In earlier decades, the Baby Boomers (defined as those born between 1946 and 1964) were vilified by younger generations. In the post-war period they had inherited a host of societal benefits – such as lower education costs, cheaper healthcare, affordable homes, wage increases that matched productivity gains and so on. During the late 20th century, however, as the Boomers became the dominant part of the electorate, their voting patterns supported and gave rise to the neoliberal economics of Reagan, Thatcher, et al. Policies introduced by these and subsequent governments began to strip away and reverse much of what the Boomers had once enjoyed, passing the costs on to Generation X and the incoming Millennials. Following the economic crash of 2008 and ongoing stagnation across much of the West, it became clear that these younger generations would be faced with enormous debts, reduced social mobility, an environmental crisis and other problems. The very real prospect of being saddled with lower incomes and poorer living standards than their parents led Millennials, in particular, to accuse the Boomers of a selfish individualism and short-termism. However, the Boomers would not be around forever. By 2020, their influence was diminishing as many began to enter their final years. In addition to waning electoral power, the Boomers were passing on record sums of wealth to their offspring. In some countries, inheritance figures more than doubled between the 2010s and 2030s, reaching a peak by 2035. This would help in offsetting the decline of living standards that the younger generations had experienced earlier. However, it leads to greater inequality within the Millennial cohort (not all of whom had parents with large sums of money or property), while for many it arrives too late for the expensive child-rearing years, when a larger home is most needed. Lion populations in Africa have declined by half. 
Between 1990 and 2015, lion populations throughout many parts of Africa declined sharply. The reductions were especially alarming in West and Central Africa. In two national parks, the Mole and Comoé – located in Ghana and the Ivory Coast, respectively – the animals were found to be extinct. The main threats to lions were the spread of subsistence farming into woodlands, open plains, and thick bush where lions hunted and bred. Being in close proximity to human settlements meant they were often killed in retaliation for attacks on livestock or humans. Alongside this, a thriving trade in bush meat was depleting the prey that lions depended on for survival. Trophy hunting was another problem, one notable example being the death of Cecil the lion, a major attraction at Hwange National Park in Zimbabwe, who was killed by a recreational big-game hunter. Having once been found in south-eastern Europe and throughout much of the Middle East and India, lions had lost 85% of their historic range by 2015. Conservation efforts were impeded due to most African nations lacking the money and resources that were needed – so inevitably, lion populations underwent further declines. By 2035, their numbers have halved again, with about 10,000 surviving mostly in southern parts of the continent, which has better wildlife protection measures and a lower density of humans. World tin reserves are running out. Tin is a silvery-white metal that is soft, ductile and malleable. Among the oldest metals known to mankind, it was discovered around 3000 BC during the Bronze Age, which is in fact named for bronze – an alloy of tin and copper. Its role as an alloying metal created a valuable trade network that linked ancient civilisations for thousands of years. Tin does not occur naturally in pure form, so it must be extracted from ores – chiefly cassiterite (tin dioxide). Because of tin dioxide's high specific gravity, tin is often mined downstream of a primary deposit – along riverbanks, in valleys, or at the bottom of the ocean. 
Therefore, the most economical extraction methods are dredging, open-pit and hydraulic mining. Historically, the largest producers of tin have been China, Indonesia, Malaysia, Peru, Brazil, and Bolivia. Tin is primarily used in soldering, metal plating, a wide range of alloys, superconducting magnets, and PVC plastics. As China and other emerging nations continue to demand resources beyond what the Earth can provide, tin is among the metals now in critical decline. By the mid-2030s, most of the large economically recoverable deposits have been completely exhausted. Local, individual, and small-scale mines – not reporting their reserves in the manner of large mining corporations – have continued to supply the markets. Recent discoveries in Colombia have also provided some temporary relief. However, an adequate long-term solution can only be found with a complete replacement for tin. Recycling has increased sharply as the market trends away from mineral sources. Distributed propulsion systems are revolutionising air travel. During this decade, a number of national militaries and commercial aerospace firms are adopting turbo-electric distributed propulsion systems for their aircraft, replacing the more traditional wing-attached engines. This is a result of recent advances in materials science, cryogenic cooling systems, novel fuels, high-fidelity computational fluid dynamics (CFD) and experimental tools. Along with hypersonic engines, this technology is contributing to an ongoing revolution in aircraft design. The basic concept of distributed propulsion is that the thrust-generating components of an aircraft are now fully integrated into the airframe of the vehicle. Instead of one or two large singular engines attached to the outside of the wing or fuselage, thrust is generated by a spanwise distribution of smaller engines or fans across the width of the wing. 
These are also more seamlessly merged into the body of the plane, offering major advantages in terms of aerodynamics and thrust. This is usually combined with a blended wing body design, creating a more streamlined, synergistic combination of all aircraft components. Airflow around the plane is optimised – allowing for steeper climbs during take-off, greater degrees of control and manoeuvrability, higher bypass ratios and much greater fuel efficiency. In addition, the majority of these systems utilise electrical propulsion. Advances in energy storage, as well as a new generation of ultra-lightweight superconductors, have finally paved the way for large-scale production of electric aircraft. These have the benefits of lighter weight, less maintenance, a noise reduction of up to 70 decibels and lower carbon footprints. Construction of these planes is also considerably cheaper in many cases. Self-driving vehicles dominate the roads. Accelerating breakthroughs in the fields of artificial intelligence, sensors and telecommunications have led to a new generation of self-driving cars. These vehicles are considerably safer and more reliable than previous models and now dominate the mainstream markets, particularly in developed nations. Today, annual purchases of autonomous vehicles are nearing 100 million worldwide, representing almost 75% of all light-duty vehicle sales. This compares with 60 million total light-duty vehicle sales in 2012 and is largely due to soaring populations and the rapid industrialisation of many countries. Simpler versions of this technology were seen in the 2010s in the form of emergency braking systems, connected vehicle networks, self-parking, and freeway cruising features. Now though, computing power and stronger AI mean that today's autonomous vehicles can outperform even the best human drivers. 
A combination of GPS, on-board sensors, traction and stability control, and adaptive cruise control allows a car to sense incoming objects from all directions, detect imminent collisions, predict the movements of other vehicles on the road, and adapt to changing road and weather conditions. Real-time updates are constantly received by the car's on-board computer, giving up-to-date information on traffic, allowing the vehicles to determine the optimal route to their intended destination. A number of hurdles had to be overcome in order to reach this point. One was the reluctance of automakers to take on responsibility for both the construction and operation of their vehicles. Another was the disruption autonomous vehicles posed to the insurance industry. Shifting responsibility from driver to manufacturer added a whole series of complications to the legal and financial proceedings of potential accidents. Indeed, the early adoption period of self-driving cars was marked by a number of high-profile lawsuits and court hearings, often hyped up by media outlets. Alongside this were the ethical implications of putting the lives of passengers and pedestrians into the hands of a machine. Despite these problems, the rapidly improving performance and inherent safety of these vehicles succeeded in boosting demand substantially. The efficiency offered by self-driving cars also helps to cut down on congestion and pollution. As well as improving road safety, most of these cars are now electric, or hybrid electric, reducing their CO2 impact. These and other factors mean that by the middle of this century, the vast majority of cars on the road will be fully autonomous. Robots are dominating the battlefield. Highly mobile, autonomous fighting machines are now being deployed in combat. Guided by AI, they can aim with inhuman precision and come equipped with powerful sensors, GPS, and thermal vision. 
They can be deployed for weeks or months at a time, if necessary, with no need for rest and only minimal maintenance. They have other advantages too – such as a complete lack of remorse or fear, and no need for training, retirement payments, or other personnel costs. However, debates are raging over the morality and ethics of these weapons systems. Norway's underwater suspended tunnels are completed. A major feat of engineering is completed in Norway this year as the final in a series of submerged floating tunnels (SFT) is opened. This novel concept consists of parallel tubes measuring 1,200 m (4,000 ft) in length, each carrying two lanes of traffic across the Sognefjord – the largest and best-known fjord in Norway and second longest in the world. Supported by their own buoyancy, the structures exploit the physics of hydrostatic thrust, or Archimedes' principle. They are suspended at depths of 20 to 30 m (65 to 100 ft), below any possible contact with ships, while withstanding tidal movements and adverse weather. The initial phase of this project became the first underwater suspended tunnel to be operational anywhere in the world. In subsequent years, it was joined by several others in nearby regions, together costing a total of $25 billion. This project is designed to ease the congestion of local ferry services and to slash travel times between the north and south of the country. For example, a car journey of 21 hours from Kristiansand to Trondheim is reduced by more than half. Most of the vehicles on the nation's roads are self-driving by now, which helps to further improve road travel times. Being underwater and out of sight (as opposed to highly visible bridges over land) also means the scenic landscapes of each region can be preserved. The global airline fleet has doubled. By 2035, the number of commercial airplanes in service has doubled compared to 2015 – going from 22,500 to over 45,000 with a total worldwide value of nearly $6 trillion. 
Most of this growth has come from smaller, single-aisle planes, driven by demand from low-cost carriers and the need to replace older, less-efficient aircraft. Perhaps unsurprisingly, airlines in the Asia Pacific region comprise the largest share of new orders (38%), followed by North America (21%), Europe (19%) and the Middle East (8%). The customer base for airplanes has become increasingly diverse and globalised, thanks to emerging markets and new business models. In 1995, airlines in Europe and North America represented 64% of all traffic. By 2035, that share has fallen to 37%, with Asia Pacific and Middle Eastern airlines becoming far more prominent in global aviation. New supersonic planes, utilising "quiet" engine technology to dampen sonic booms, have expanded the available long-haul routes, reducing journey times and making the world feel smaller and more interconnected than ever before. However, all regions still face the challenges of fuel-price volatility, emission controls, and ever-increasing airport and airspace congestion. Biofuels and other clean technologies are used for many more planes than in earlier decades. Even solar power is now being adopted for short to medium haul flights alongside new hydrogen planes, although these still represent a minority of aircraft. With air traffic growth still outpacing efforts to reduce pollution, the aviation sector has become an increasingly significant contributor to global greenhouse gas emissions.

2036: Hepatitis C has become a rare disease in the U.S. Hepatitis C is an infectious disease affecting primarily the liver, caused by the hepatitis C virus (HCV). The infection is often asymptomatic, but chronic infection can lead to scarring of the liver and ultimately to cirrhosis, which is generally apparent after many years. In some cases, those with cirrhosis will go on to develop liver failure, liver cancer, or life-threatening oesophageal and gastrointestinal damage. HCV is spread by blood-to-blood contact from intravenous drug use, poorly sterilised medical equipment, transfusions, body modification (such as tattoos or piercings) and high-risk sexual activity. The existence of hepatitis C (originally identified only as a type of non-A non-B hepatitis) was first suggested in the 1970s and proven in 1989. By the early years of the 21st century, an estimated 150–200 million people globally were infected. Those who developed cirrhosis or liver cancer required a liver transplant in some cases. However, in many regions of the world, people were unable to afford treatment as they either lacked insurance coverage or the insurance they had would not pay. In the United States, the average lifetime cost of HCV was estimated at over $33,000, with a liver transplant costing approximately $200,000, while more than 15,000 deaths were attributed to the disease each year. Standard approved treatments were able to cure 50–80% of patients. However, a new generation of medicine was emerging, based on nanoparticles and direct-acting antivirals – able to target specific virus enzymes. This was alongside trends showing improvements in access to treatment, and more aggressive screening guidelines. By the 2010s, over a hundred new medications were being researched and developed. By 2036, only 1 in every 1,500 people in the U.S. is infected with HCV. In-vitro meat is a mature industry. 
Recent advances in tissue engineering have made it possible to "grow" synthetic meat, using individual animal cells. This first became affordable to the public in the 2020s. After years of further testing and refinement, a wide range of different meat products are now available, in what has become a rapidly expanding market. In-vitro meat has a number of advantages. Being just a lump of cultivated cells, it is produced without harm or cruelty to animals. It is unusually pure and healthy whilst retaining the original flavour, texture, and appearance of real meat. Perhaps most importantly, it requires far less water and energy to produce, greatly lessening the impact on the environment. As with GM crops, political and psychological hurdles delayed its adoption in some countries. However, rising food prices caused by population growth and ecological impacts, together with endorsements from animal welfare groups, later gave impetus to its development. Though still years away from completely replacing traditional meat, it is now a mainstream product in most countries around the world. Alzheimer's disease is fully curable. Treatments for Alzheimer's developed in the 2020s reduced the risk of acquiring the disease by more than half. Now, thanks to pioneering efforts, a further decade of progress is yielding effective cures. Drawing from a myriad of long-term studies, researchers have identified the precise mechanisms and processes involved in the loss of neurons and synapses in the cerebral cortex and subcortical regions. Faulty genes can be "switched off" with a new generation of drugs, while the brain itself is regenerated using stem cells. This breakthrough was aided in part by reverse-engineering of the human brain, which provided researchers with a complete model of its neurological system down to the cellular level. Nano-scale robotics are now increasingly common in medical procedures, and these can precisely target individual cells. 
The ability to combat Alzheimer's is one of the great success stories of the 2030s. It comes at a time when dementia rates are soaring, with cases predicted to quadruple in the four decades from 2010 to 2050. Detailed probing and mapping of the Kuiper Belt is underway. Advances in telescopic power have continued to reveal new bodies in the Kuiper Belt, some rivalling Pluto in size. At the same time, a new generation of solar-sail technology is emerging. Spacecraft using this form of propulsion were first demonstrated in 2010. Much larger versions are being deployed now with membranes extending hundreds of metres, offering greatly improved thrust-to-mass ratio – up to 50 times higher than in previous designs. This is made possible through nanotechnology and space-based production of sail panels. Following in the footsteps of New Horizons, a whole series of these probes is now being sent to the Kuiper Belt, which until now was only sparsely explored. Close range studies are conducted on ancient, icy planetoids of this remote region. With better telescopes and longer-range probes, humanity is penetrating ever further into the depths of space. Astronomers are now forming a highly detailed and extremely accurate map of our Solar System as a whole. Lemurs are on the brink of extinction. After years of decline, the vast majority of the world's 103 species of lemur are facing extinction. This has been the result of decades of sustained deforestation, mining, hunting, and slash-and-burn farming in Madagascar – their only natural habitat on the planet. By now, very little of the island's original forest cover remains. This has forced lemurs and countless other species into increasingly small and isolated patches of liveable habitat. In earlier decades, a number of efforts were undertaken which attempted to preserve the remaining populations. The effectiveness of these projects was severely limited by the social and political climate of Madagascar. 
Government corruption and the impotence of law enforcement meant that any restrictions on deforestation and poaching were poorly enforced or outright ignored. The extreme poverty of the nation also forced many inhabitants to turn to the forests to illegally cut wood or dig for gold in order to support themselves. Many hunted lemurs for food as well. Today, the majority of remaining individuals can be found only in zoos and private collections. Lemurs are now joining the ranks of the radiated tortoise and many other species disappearing from Madagascar. By the middle of the 21st century the island will have experienced one of the most dramatic mass extinctions in human history. This will occur alongside many similar events throughout the natural world.

2037: Total solar eclipse in Australia and New Zealand. A total solar eclipse occurs on 13th July 2037. It passes through the centre of Australia at 2:40 UTC (12:40 local time) with maximum eclipse occurring near the intersection of three states – Queensland, the Northern Territory and South Australia – before moving across the North Island of New Zealand. Totality has a duration of three minutes and 58 seconds. The U.S. Air Force introduces a new stealth bomber. By 2037, the number of bombers in the US Air Force has dropped below the minimum requirement of 170. This is due to a combination of attrition, changes in operating procedures and decommissioning of older aircraft. The B-52 has now been in service for 85 years – an unprecedented length of time for a military vehicle. The last of these planes will finally be retired soon. A next-generation bomber, intended to serve as a stopgap until more advanced designs were available, was introduced from 2018. This is now being replaced by a new military aircraft known by the codename "2037 Bomber". The new bomber is the most advanced aircraft to ever fly. It has unparalleled stealth capabilities, a range that enables it to strike targets almost anywhere in the world, and a payload which includes nuclear capability. It is "manned optional", with most missions being remote-controlled, or even entirely automated. It is used in a number of resource wars during this time – giving the US an impressive tactical edge on the battlefield. America's sixth-generation fighter jet enters service. By 2037, deliveries of the F-35 Lightning II for the U.S. military have ceased. Although the aircraft is scheduled to remain in service until 2070, it is succeeded around this time by a sixth generation of planes that begins to be rolled out. The U.S. Navy's existing fleet of F/A-18 Super Hornets is also being retired now, necessitating a replacement. The new fighter jets are procured for both the U.S. 
Navy (a program known as F/A-XX) and Air Force (known as F-X). In terms of technology, they are a major leap over the F-35 and also designed to outclass China's Chengdu J-20 and Shenyang J-31. The sixth-generation jets feature increased autonomy (with the option of being unmanned), order-of-magnitude improvements in computer processing and algorithmic power, faster manoeuvring and sensing of the battlespace, hypersonic weapons, laser guns, advanced electronic warfare capabilities, better stealth technology and so-called "smart skins" where sensors are built into the side of the aircraft itself to reduce drag. They incorporate a supersonic tailless design for the first time ever, made possible through advanced computer modelling and new materials.

2038: Older computers are at risk of experiencing major software malfunctions. The Year 2038 problem (also known as "Y2K38" by analogy to the Y2K Millennium bug) gains considerable public and media attention this year. It affects programs written in the C programming language. These were relatively immune to the earlier Y2K problem, but suffer instead from the Year 2038 problem. They use a library of routines called the standard time library. This takes a stored, 32-bit integer and interprets the current value as the number of seconds which have passed since 00:00:00 UTC on Thursday, 1st January 1970. Because of the limited number of possible values that can be derived from this 32-bit integer, the farthest time that can be represented in this way is 03:14:07 UTC on Tuesday, 19th January 2038. Any times beyond this point will "wrap around" and be stored internally as a negative number, which these systems interpret as a date from 1901, rather than 2038. This is called integer overflow. For older computers that still use this system, major problems begin to arise with file systems and databases, due to erroneous calculations. Fortunately, most systems have been upgraded by now, and little overall damage is done. NASA's Trident spacecraft arrives at Neptune. Trident is a mission to the outer planets, proposed in 2019 as part of NASA's Discovery program. Launched in 2026, the probe makes gravity-assist flybys of Venus (2027), Earth (2028 and 2031), Jupiter and its moon Io (2032), before reaching Neptune in 2038, with a focus on its largest moon, Triton. Voyager 2, the last mission to Triton, flew past in 1989 at a distance of 40,000 km (25,000 mi). It found evidence of geological activity, with a young surface and relatively few impact craters, several cryovolcanoes and a very thin atmosphere. 
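Returning to the Year 2038 overflow described above, both of the quoted dates can be verified with a short calculation. A minimal sketch in Python, which simulates the signed 32-bit wraparound explicitly (Python's own integers do not overflow):

```python
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
INT32_MAX = 2**31 - 1  # largest value a signed 32-bit time_t can hold

# The last representable moment: 03:14:07 UTC on 19th January 2038
print(EPOCH + timedelta(seconds=INT32_MAX))

# One second later, the counter wraps around to -2**31, which an affected
# system decodes as a date in December 1901:
wrapped = (INT32_MAX + 1) - 2**32  # simulate the signed 32-bit overflow
print(EPOCH + timedelta(seconds=wrapped))
```

This reproduces the cut-off quoted above – 2038-01-19 03:14:07 UTC – and shows the wrapped value landing in December 1901, which is why affected systems report dates from 1901 rather than 2038.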
Despite evaporation taking place during "summer" periods, the moon had the coldest average temperature in the Solar System at just 35.6 K (−237.6°C), less than 36 degrees above absolute zero. After this fleeting glimpse by Voyager 2, the icy body would remain unvisited for another 50 years, leaving many questions unanswered. Trident would aim to solve these scientific mysteries, while obtaining a near-complete map of the surface (Voyager 2 only mapped 40%) and providing high-resolution images. The spacecraft performs a sole flyby, passing through Triton's thin atmosphere at 500 km (310 mi) altitude, sampling its ionosphere with a plasma spectrometer and performing magnetic induction measurements to assess the potential existence of an internal ocean. Its payload also includes an infrared spectrometer, narrow-angle camera, and wide-angle camera. In addition to studying Triton, the probe reveals new insights into Neptune – and therefore advances the understanding of "ice giants", which are the most common type of exoplanet. A strong incentive to launch the mission is a rare and efficient alignment between Jupiter, Neptune, and Triton in 2038. Additionally, Neptune's position in its orbit around the Sun during this time offers the best opportunity to study Triton's ice plumes – missing this window would mean waiting another century. The New Horizons probe is 100 AU from the Sun. The New Horizons probe was launched in 2006, arriving at Pluto in 2015. After surveying the dwarf planet and its five moons, the spacecraft headed towards the Kuiper Belt where it studied a small, icy body. This phase of the mission lasted until 2022, after which New Horizons began journeying to a boundary known as the outer heliosphere, where the solar wind meets the local interstellar medium. 
By 2038, it has reached 100 astronomical units (AU) from the Sun – equivalent to 100 times the distance between the Earth and Sun – and is moving in the direction of the constellation Sagittarius, which contains the supermassive black hole at the centre of our galaxy. Even though it was launched much faster than any outward probe before it, New Horizons will never overtake Voyager 1 or Voyager 2 as the most distant human-made object. Close flybys of Saturn and its moon Titan gave Voyager 1 a decisive gravity assist. When New Horizons reaches 100 AU, it is travelling at 13 km/s (29,000 mph), around 4 km/s (8,900 mph) slower than Voyager 1 at that distance. Capital punishment has greatly declined in use. At the start of the 20th century, capital punishment was used in almost every part of the globe – including the most developed nations. In the latter decades of the century, however, many countries abolished it. The last guillotining in France was conducted in 1977, while in the UK, the death sentence for treason was ended in 1998. Americans' views on the issue varied significantly, but the number of executions performed in the US showed a consistent long-term decline. Peaking in the mid-1930s, they fell dramatically thereafter, briefly rising in the 1990s before dropping again in the 21st century. Between 2000 and 2010, executions in the US plummeted by over half. It had also become far cheaper to imprison people for life. By 2010, nearly 50% of countries had outlawed the death penalty for all crimes. This reflected concerns over the possibility of executing the innocent, as well as the morality of such brutal punishments in modern civilised society. Public opinion continued to shift in favour of bans – a trend fuelled by growing access to information brought by the Internet, media, and technology in general. 
This included mistakes revealed by DNA evidence, for example, as well as high-profile media investigations, and the work of international human rights organisations such as Amnesty International. Another factor sustaining this trend was the growing urbanisation and democratisation of the planet, with cities tending to favour more liberal and progressive policies than smaller, traditional rural communities. Yet another factor was the ongoing influence of feminism in society, with women tending to oppose capital punishment more than men. However, it remained entrenched in some regions: notably China, which held more executions than the rest of the world combined, killing thousands of its citizens every year. Iran, North Korea, Saudi Arabia, and Yemen were also notorious for their executions, sometimes carried out for highly dubious reasons (e.g., sorcery). By the late 2030s, virtually all nations in the developed world have abolished the death penalty – while a minority of repressive and pariah states continue to practice it. Though its prevalence has fallen in Muslim society as a whole, the global abolition of capital punishment remains elusive, for now. Re-emergence of Brood X cicadas. Cicadas are medium-sized insects with prominent eyes set wide apart, short antennae, and membranous front wings. They are widespread, with more than 3,390 extant species found on all continents except Antarctica. Cicadas are known for their extremely loud song. Thopha saccata, for example – also known as the Double Drummer – has the loudest sound of any insect and can reach in excess of 120 dB if large numbers are in close range. The insects typically live in trees, feeding on watery sap from xylem tissue and laying their eggs in bark. They fall into two main categories: annual cicadas, spotted every year, and periodical cicadas, which spend most of their lives underground and only emerge once every decade or two. 
The largest and most concentrated group of periodical cicadas, called Brood X, appears every 17 years. Having previously emerged during 2004 and then again in May 2021, it returns in 2038. As before, this becomes the world's largest insect swarm, covering much of the eastern United States with as many as 1.5 million insects per acre. The states affected are New York, New Jersey, Pennsylvania, Delaware, Maryland, District of Columbia, Virginia, West Virginia, North Carolina, Georgia, Tennessee, Kentucky, Ohio, Indiana, Illinois, and Michigan. During this period, which can last for several weeks, many regions have to endure the shrill noise of male cicadas attracting female mates. Although cicadas are not life-threatening to humans, residents are advised to use noise-cancelling earphones or earplugs to prevent ear damage, while avoiding garden activities like planting and pruning. Pets should be kept from consuming the insects, which can cause digestive distress. Swimming pools must be covered to avoid damage to filters, while the use of outdoor power tools is discouraged, as the insects are attracted to the sound. The insects' full life cycle – from emergence above ground, to their mating and subsequent death – typically lasts for six to ten weeks. The next generation of Brood X emerges in the year 2055. The FIFA World Cup trophy is replaced. The current World Cup trophy has been in use since the 1974 World Cup. There is space for only 17 winning countries to be engraved on its base. In 2038, the final name plaque is filled in, and a replacement cup is commissioned with a new design. Like its predecessor, this is made of 5 kg (11 lb) of 18 carat gold. Coal power has been phased out by Germany. During the 20th century, Germany obtained its electricity predominantly from fossil fuels (particularly coal) and then later also nuclear power. As Europe's largest consumer of electricity, Germany produced substantial carbon emissions, ranking sixth in the world. 
At the dawn of the 21st century, however, a radical change began to occur as its supply shifted to new, less polluting forms of energy. In 2010, the German government published Energiewende ("energy transition"), a key policy document outlining targets for increasing the share of renewables in power consumption, which included greenhouse gas (GHG) emission reductions of 80–95% by 2050 (relative to 1990). Following the Japanese Fukushima disaster of 2011, the government abandoned nuclear power as a bridging technology and decided to phase it out altogether by 2022. This move triggered a brief rise in coal, to make up the shortfall. However, renewables were expanding rapidly, with solar and wind forming an ever-larger share of electric generating capacity. In 2019, a government-appointed coal commission introduced a proposed pathway to phase out all coal power within two decades. By the end of the 2010s, Germany had about 40 GW of installed coal power capacity, with 21 GW fired by bituminous coal – referred to as "hard coal" by Germany's Federal Network Agency – and 19 GW by lignite, or "brown coal". A bituminous coal plant, Datteln 4, entered service in mid-2020, adding 1.1 GW at a cost of €1.5 billion ($1.6 billion), becoming Germany's last ever coal plant to be newly connected to the grid. Between 2020 and 2026, hard coal capacity reductions – implemented using auctions organised by the Federal Network Agency – saw many plants voluntarily taken offline in the north, west and south of the country. As renewables continued to surge in capacity, forced closures occurred in the latter part of the decade and into the 2030s. Of the 84 sites operational during 2019 (when the strategy was first announced), just a handful now remained. The phase-out had commenced in western Germany, to soften the impact in the economically poorer eastern side of the country. 
With bituminous coal disappearing from the energy mix, attention now turned to the softer lignite, concentrated mostly in the east. By 2038, the final plant shutdown has occurred. A compensation plan of some €40 billion ($43.7 billion) is further helping the transition, which includes adaptation payments for older workers in lignite mines and in hard coal and lignite power plants who lost their jobs to the coal exit. This provides a maximum total of €5 billion ($5.5 billion) by 2048. The space industry exceeds $1tn worldwide. By the late 2030s, the worldwide space industry exceeds $1 trillion in size. This represents a quadrupling compared to 2010 and a tenfold increase since the start of the 21st century. The rapid growth in this sector has been fuelled, in large part, by explosive demand for high-speed Internet services, including the recent establishment of a global, quantum-encrypted satellite network for ultra-secure communications. However, a number of other areas are now booming, including space tourism (via space planes and very high-altitude balloons) and resource extraction from near-Earth asteroids. The latter, while still accounting for only a tiny fraction of global commodities, is now considered a relatively routine activity as the industry's technology has matured. In 1997, private investment in space overtook government spending for the first time. This trend continued into the 21st century, with space becoming more and more commercialised. Access to space was being made cheaper and easier by a new generation of launch vehicles. These and other technological innovations were enabling even small companies to compete and do what only big government agencies had done in the past. 
Some of the most famous entrepreneurs to emerge during this time included Elon Musk, Jeff Bezos, Peter Diamandis, Robert Bigelow, and Richard Branson; but many less well-known individuals and groups were also taking advantage of the changing industry landscape. Crowdfunding, for example, meant that even casual Internet users could now have a stake in the development of space projects – like the new class of "CubeSats". These tiny spacecraft could piggyback alongside larger, more expensive missions. While private commercial enterprises are the dominant driving force, government space agencies still have a role to play. NASA has been developing its manned Mars program and this decade sees its astronauts walking on the Red Planet for the first time. In addition, a number of huge telescopes are being built that dwarf any previous observatory. China, Europe, India, Japan, Russia, and other agencies have also made progress in human exploration, with international collaborations to establish the first bases on the Moon. Meanwhile, new countries are now appearing on the scene and establishing their own national space agencies, as their economies become advanced and rich enough to do so. The number of countries with independent launch capability has also continued to increase.
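The growth figures quoted above imply a rough compound annual rate for the space industry. As a quick illustrative calculation (the starting values of roughly $100 billion in 2000 and $250 billion in 2010 are back-derived from the "tenfold" and "quadrupling" claims, not stated directly in the text):

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate between two values over `years` years."""
    return (end_value / start_value) ** (1.0 / years) - 1.0

# Tenfold growth from roughly $100bn in 2000 to $1tn by 2038
rate_since_2000 = cagr(100e9, 1e12, 38)
# Quadrupling from roughly $250bn in 2010 to $1tn by 2038
rate_since_2010 = cagr(250e9, 1e12, 28)

print(f"{rate_since_2000:.1%} per year since 2000")  # ~6.2% per year
print(f"{rate_since_2010:.1%} per year since 2010")  # ~5.1% per year
```

In other words, a sustained growth rate of only 5–6% per year is enough to produce the trillion-dollar industry described here.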

2039: The world's first trillionaire. The gap between the richest and poorest has now reached astronomical proportions. By the late 2030s, a well-known American business magnate has achieved a net worth exceeding $1 trillion. This is 12 times the highest figure reported at the turn of the century and is equivalent to the entire GDP of Mexico in 2015. A major growth area in terms of innovation and wealth creation is the exploitation of space resources, such as metals and minerals from the Moon and asteroids – now the next big thing in commercial space ventures. By the 2070s, there are more than ten trillionaires in the world. Manufacturing jobs have largely disappeared in the West. In the United States and most other developed nations, manufacturing has gone the same way as agriculture – vitally important yet employing very few people. Robots, automation, and 3D printing, now sufficiently perfected after decades of development, have taken over a wide range of roles once performed by humans. As China and other emerging nations make the transition to service-based economies, they too will experience this trend in the not-too-distant future. Australia's national symbol, the koala, faces extinction. By this date, the koala population in Australia has dwindled to almost nothing, due to the combined impacts of drought, disease, climate change and loss of natural habitat. Only those in captivity now remain. Five-year survival rates for leukaemia are approaching 100%. Leukaemia is a cancer of the blood or bone marrow, characterised by an abnormal increase in the number of immature white blood cells, called "blasts". In 2008, there were 350,000 new cases worldwide, and 257,000 deaths from the disease, placing it among the top 10 causes of cancer death. However, treatments improved greatly over the decades, with survival rates showing a consistent upward trend. Gene therapy was among the most successful new approaches. 
One such method turned a patient's own T-cells into cancer-targeting attackers. In one study, conducted between 2010 and 2011, two of three patients remained cancer-free after a year. By 2039, five-year survival rates in many countries are reaching 100%. Extreme heatwaves are commonplace in the U.S. The previous five decades were all the hottest on record – each surpassing the last. Extreme heatwaves are now having a serious impact on agricultural yields and human health. This is a particular problem in the American West. From 2030 to 2039, most areas of Utah, Colorado, Arizona, and New Mexico have at least seven summers as hot as the hottest season recorded between 1951 and 1999. The hottest daily temperatures of the year from 1980 to 1999 have become twice as frequent. There are persistently drier conditions around the country, with substantial reductions in soil moisture and an accompanying rise in forest fires.
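The heatwave claim above is, at its core, a statement about exceedance counts: how many seasons surpass the hottest season of a historical baseline. A minimal sketch of how such a count is computed, using entirely hypothetical temperature data (the values below are invented for illustration, not from the source):

```python
def count_exceedances(seasonal_means, historical_max):
    """Count seasons whose mean exceeds the hottest season of a baseline period."""
    return sum(1 for t in seasonal_means if t > historical_max)

# Hypothetical summer-mean temperatures (degC) for 2030-2039,
# compared against a hypothetical 1951-1999 peak of 26.0 degC
baseline_peak = 26.0
summers_2030s = [26.4, 25.8, 26.9, 27.1, 25.9, 26.5, 27.3, 26.2, 27.0, 26.8]
print(count_exceedances(summers_2030s, baseline_peak))  # 8
```

Under this toy dataset, eight of the ten summers exceed the baseline peak – the kind of result the "at least seven summers" projection describes.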

2040: India's economy is rivalling that of China and the U.S. By 2040, rapid economic growth has enabled India to catch up with China and the U.S. These three countries – the G3 – now have by far the largest share of world GDP. India has benefitted from a number of cultural, economic, and demographic trends in recent decades. This includes a youthful, growing and economically productive population, now the world's largest at 1.6 billion, with an average age of just 34. By comparison, China's average age is 46 and its working population has been declining. Expansion and development of India's service sector – adding greatly to the global knowledge-based economy – has occurred in parallel with a slowdown in China's growth rate as its economy matures. India has also managed to avoid many of the disruptive challenges experienced by China, since its market-based economy is already part of a liberal democracy, unlike the planned economy of its rival. India's currency, the rupee, is now challenging the renminbi as the world’s strongest. Due to its global influence and military capabilities, India has also gained a permanent seat on the UN Security Council. Later this decade, on 15th August 2047, the nation will celebrate its 100th anniversary as an independent state. However, climate change and other ecological impacts are converging with increasing speed. In particular, neighbouring Bangladesh requires ever more financial support and humanitarian aid, as flooding worsens. Widespread automation and technological unemployment are also emerging. By the 2050s, India's growth has begun to stagnate, as the world faces a crisis unparalleled in history. Fusion power is nearing commercial availability. A prototype commercial fusion reactor is nearing operation. DEMO (Demonstration Power Plant) is the successor to ITER and is designed to build on the success of that project, achieving a number of new breakthroughs. 
Among the problems tackled along the way were containing the plasma at high enough temperatures, maintaining a great enough density of reacting ions, and capturing high-energy neutrons without melting the walls of the interior. Constructed in the 2030s and early 2040s, DEMO is now close to being perfected. Later this decade, it will produce a sustained output of 2 gigawatts (GW), making fusion commercially available for the first time. Deep ocean mining operations are widespread. Exploitation of the ocean for its resources had for centuries been confined to fishing and coastal developments. Limits in technology made ventures into deeper waters both impractical and economically unviable. Interest in deep sea mining first appeared in the 1960s, but consistently low prices of mineral resources at the time halted any serious implementation. By the 2000s, the only resource being mined in bulk from the ocean was diamonds, and even then, just a few hundred metres below the surface. Large-scale efforts continued to be hindered in the first decades of the 21st century. By 2040, however, advances in robotics and telepresence have led to a fully mature industry – opening up the vast and previously unobtainable wealth of fuel and mineral reserves along the ocean floor. In the past, retrieval operations were limited to manganese nodules (resource-rich rock concretions found on the ocean floor) and metal-rich sediments around hydrothermal vents. Now, thanks to new extraction methods and processing techniques, even the low concentrations of elements found in mud layers are economically viable. Today, prospecting and undersea construction are done using fleets of automated and remote-controlled robots. Once ships or mining platforms are in place, resources are brought to the surface through hydraulic suction or continuous bucket line systems. The primary focus of these current efforts is rare-earth metals. 
Rapid growth in demand for these elements, used in a wide range of electronics and other hi-tech applications, has collided in recent years with increasingly dire shortages of supply. This has turned them into resources of strategic importance on the level of oil and natural gas in earlier decades. It has become particularly apparent in Asia, with nations such as India, Japan, South Korea, and Indonesia ramping up efforts to free themselves from the near monopoly that China holds. Another valuable, though hazardous, target of deep ocean mining is methane hydrate deposits. This so-called "fire ice" consists of concentrated methane trapped within a crystal lattice of frozen water. It is found throughout the deep ocean in sedimentary structures and outcroppings, with some of the largest known deposits found in the seas of the West Pacific and along the coasts of North America. The methane locked in hydrates far outweighs the total amount of conventionally recoverable natural gas. Several nations have established mining operations, with Japan, China, and the United States among the largest producers. Traditional forms of ocean exploitation still exist. While the number of oil drilling platforms has declined overall, deepwater (500–1,500 metres) and ultra-deepwater operations (1,500 metres or deeper) continue to expand as easily recoverable reserves become scarce. Following recent spills and ecological disasters, some on the scale of the Deepwater Horizon spill of 2010, such operations remain highly controversial. In general, most ocean mining and drilling operations are targets of criticism. Concerns over the environmental impact of ocean floor dredging and prospecting have led to stricter regulations in many countries, as well as the development of protected ocean zones. Nevertheless, the impact of deep ocean mining is still considerable in many regions. Even more polarising is methane hydrate drilling, which threatens to further accelerate the pace of global warming. 
Despite efforts to eliminate leakage and minimise its impact, methane hydrate mining is still a risky business, with a number of countries banning it outright. Less than two-thirds of the original Congo jungle remains standing. The Congo region is a sedimentary basin for the drainage of the Congo River in west equatorial Africa. At the turn of the 21st century it contained a quarter of the world's tropical forests, with a total area of 2.5 million sq km. It held some of the largest undisturbed portions of tropical rainforest on the planet, second only to the Amazon in Brazil. Spanning six countries, it was home to over 10,000 unique species of tropical plants, 30% of which could not be found anywhere else on Earth. There were over 1,000 bird species, 700 fish species and 400 mammal species. Some noted examples included the bonobo (one of humanity's closest living relatives), the forest elephant, the okapi, the Congo peafowl, and various species of gorilla. Rare and unique frogs, bats, rodents, and birds, together with plants such as orchids, could also be found. Despite efforts to slow deforestation in the developing countries of Africa, the rainforests of the Congo Basin and elsewhere continued to recede over the decades. As early as the late 2010s, Nigeria's forests had shrunk to almost nothing, while the situation in Central Africa had worsened too. Alongside the bush meat trade impacting fauna, the largely unregulated logging industry continued to chip away at the flora. Interest in foreign markets led to massive mining operations being conducted in the region, dealing severe damage to its ecosystem. Untapped deposits of raw minerals and metals – estimated to be worth in excess of US$24 trillion – attracted companies in droves. The expansion of cities and construction of new dams also played a role in harming the fragile environment, while slash-and-burn farming practices began to run rampant as the population soared. 
Ever-increasing resource demands and the need for economic growth led governments to look the other way during much of this exploitation. This was despite an outcry from the international community and environmentalists. By 2040, climate change is having an impact too. Since the vast majority of rainfall is generated in the region itself, the resulting isolation makes it more vulnerable to global warming. A large proportion of moisture in Central Africa is produced by evapo-transpiration of trees in the Congo Basin. Substantial reductions in rainfall are now occurring. Loss of forests, especially through fires started by farmers, is pumping huge amounts of CO2 into the atmosphere. The rainforest is now transitioning from a carbon sink to a carbon source. With 66 gigatons of "volatile" carbon – and a further 50 gigatons in the rest of tropical Africa – the equivalent of five years' worth of global emissions will eventually be released. These factors have converged so that, by 2040, less than two-thirds of the original Congo remains. Prior to the arrival of human civilisation, rainforests covered between 80% and 85% of the total land area in the region – around 3.29 million square km (1.27 million square miles). By the mid-20th century, one-fifth had disappeared. Deforestation began to accelerate in the 21st century, due to rapid population growth and economic development. By 2020, the rainforests were declining by 0.3% each year; by 2030 this had risen to 0.5% per year and by 2040 the rate is 0.7%. In addition to extinctions of animals and plants, numerous indigenous tribes are being uprooted, their cultures disrupted and, in some cases, lost forever. There is much social and political upheaval in the region. On top of this, local resource conflicts are beginning to break out, primarily over food. This is only serving to exacerbate the environmental damage. 
Many areas of forest have become battle grounds, while civilian populations are forced to become more self-sufficient, turning to their surrounding local environment for resources. First generation brain-computer interfaces reached the consumer market in around 2010. This technology was crude and limited to begin with: more of a novelty than a serious application. Devices could perform only the simplest of operations, such as directional commands. Some university experiments successfully created text messages using thought power alone but were slow and required bulky equipment to do so. Advances by 2020 enabled the translation of thoughts into intelligible, recognisable speech by combining speech synthesisers and artificial intelligence – but the process remained slow and inefficient. By 2030, however, exponential progress had been made in mapping and understanding the brain and its neuroelectrical signals. This was filtering down rapidly to the consumer market. Detailed, real-time messages were becoming possible, using non-invasive methods. The graphical interfaces used in composing messages had also been much improved, with more intuitive navigation and features. By 2040, the technology is largely perfected for everyday use. It works well and is cheap enough to have spread to even developing countries. Privacy and security issues have been resolved, with personal firewalls able to restrict any unwanted intrusion or hacking attempts. The headsets, visors, and earphones necessary for users have been miniaturised and made more comfortable. Some are even fully implantable. Whether for business or personal use, people everywhere are now enjoying a faster, more sophisticated, more private way of communicating. This form of "virtual telepathy" – and the convergence of other network-based technologies – is radically reshaping society and culture during this time. 
A speculative bubble forms on the stock markets, with investors everywhere forecasting a revolution in telecoms. This temporarily overheats the economy, resulting in a crash similar to the dotcom collapse of early 2000. Biorepository and genomic information systems are transforming healthcare. By now, most countries have established a national biorepository and genomic information system, with mandatory entry for all citizens. In other words, governments have a genetic sample of every person. This is needed for a variety of reasons – from national security to public health, citizen ID, immigration control, resolution of crimes and more – but the most common use is in healthcare. These genomic information systems are integrated with electronic health records and personal health records, allowing identification and treatment of disease and healthcare issues at the earliest opportunity. Hard data from these systems allow doctors and surgeons to better treat their patients, while government and researchers can target time and resources more efficiently. By utilising such a broad spectrum of information, medical schools and healthcare providers can train and employ the best possible mix of specialists for their patient population. The focus of healthcare has shifted in recent years – to preventative methods, as opposed to reactive treatment after a disease has taken hold. As well as saving more lives, this has major economic benefits too. By now, the average person is using at least one biotechnological implant. Once again, these devices are tailored to their exact personal health requirements. For example, they can be programmed to monitor specific conditions and to dispense medication when needed while simultaneously notifying a doctor. They can identify a patient who is unconscious or unable to communicate for whatever reason, providing vital clinical information during an emergency. 
They can also be used as tracking devices for patients with psychiatric or neurological conditions. Pollen counts have more than doubled. In 2000, pollen counts for the US averaged 8,455 grains per cubic metre of air. By 2040, this figure has risen to 21,735 – largely due to climate change, which has caused major alterations in weather, precipitation, and temperature. Alongside this, the hay fever season has shifted to earlier in the year, with pollen counts now peaking on 8th April, compared to 1st May at the start of the century. Similar changes have taken place in countries around the world. Thankfully, new treatments are now available to prevent allergic reactions. Recent years have seen major advances in gene therapy, for instance. These drugs can "repair" the DNA of hay fever sufferers. Tobacco has been largely eradicated. In the USA, tobacco use peaked in the early 1960s with nearly 45% of adults smoking regularly. As the health risks became more apparent, efforts were made by government, public health advocates, grassroots organisations, and others to raise awareness. These campaigns were remarkably successful in stemming the rates of smoking and tobacco-related disease and death. Smoking was banned in aeroplanes, office buildings and later in public locations such as bars and restaurants. Strict laws on the advertising of tobacco products and their use in movies and television were also introduced. In addition, improvements were made in the availability and efficacy of smoking cessation aids and pharmaceuticals. By the early 1990s, the number of US adult smokers had plunged to 25% and by 2010 the figure was down to 20%. By 2020, smoking in public was banned across every US state and in many other countries around the world, with smoking rates continuing to decline. Efforts continued over the following two decades and once again proved to be highly successful. 
The costs of government interventions were surprisingly small, less than 50 US cents per person per year in countries such as India and China. By 2040, less than 5% of the global population is smoking. Life expectancy for cystic fibrosis reaches 70. Cystic fibrosis (CF) is a genetic disorder that most critically affects the lungs, as well as the pancreas, liver, and intestine. It is characterised by abnormal transport of chloride and sodium across an epithelium, leading to thick, viscous secretions. The name cystic fibrosis refers to the characteristic scarring (fibrosis) and cyst formation within the pancreas. Difficulty breathing is the most serious symptom and results from frequent lung infections that are treated with antibiotics and other medications. Other symptoms, including sinus infections, poor growth, and infertility, affect other parts of the body. When the disease was first described in 1938, survival beyond infancy was rare. In 1952, Paul di Sant'Agnese found abnormalities in sweat electrolytes; a sweat test was developed and improved over the next decade. Despite new treatments – including lung transplants – life expectancy for those affected by the condition remained low throughout the 20th century. By the 1980s, it was still in the twenties. A major breakthrough was achieved in 1989, however, when the cystic fibrosis transmembrane conductance regulator (CFTR) gene was discovered. Subsequent research uncovered thousands of different mutations affecting the gene. As our knowledge of the underlying molecular causes and ways of treating the illness continued to improve, so too did life expectancy. Following several milestones in research, it has reached 70 by 2040. Breakthroughs in carbon nanotube production. After decades of research, new processes have been developed for synthesising carbon nanotubes, promising to revolutionise the fields of engineering, architecture, and materials science. 
Having been limited to a few centimetres, these structures can now reach potentially thousands of miles in length. Purification techniques ensure maximum tensile strength, making them hundreds of times stronger than steel. Among the many applications, the technology for a space elevator is now available. Political and financial will are the only remaining obstacles for such a project. Submarine exploration of Titan. The first probe to Saturn was Pioneer 11 in 1979, which confirmed that its largest moon Titan was probably too cold to support life. This was followed by Voyager 1 and 2, in 1980 and 1981, respectively. Cassini–Huygens was launched in 1997, arriving in 2004, with a lander that returned the first pictures of Titan's surface in 2005. In subsequent years, a number of conceptual missions were proposed for returning probes to Titan. Of particular interest were the moon's hydrocarbon lakes and oceans, thought to have conditions similar to those on Earth during its early history. Most of NASA's budget and objectives had already been assigned for the next two decades. However, the NASA Institute for Advanced Concepts was established for longer term, visionary goals. Among the projects to emerge from this program was an unmanned submarine intended to explore the subsurface environment of Titan. This began to progress from initial feasibility studies to more detailed and practical designs. As the years went by, mission capabilities were being enhanced by a new generation of robotics – some aspects of which could be seen in the deep ocean mining operations now appearing on Earth – while access to space was now a fraction of the cost it had once been. A launch was timed to coincide with Titan's summer during the early 2040s, maximising the period of ice melt and ease of manoeuvrability. The vessel would be delivered to Kraken Mare, a huge lake of methane and ethane approximately 1,000 ft (300 m) deep. 
This unpiloted submarine features onboard real-time navigation, hazard avoidance systems, an exploration sensor suite, and autonomous science investigation software. Stunning high-quality videos and a plethora of data are returned from this strange alien environment, where temperatures reach below -179°C (-290°F). China's HSR network has been greatly expanded. China's rapid economic growth in the early 21st century was aided by its massive investments in infrastructure. Highways, bridges, tunnels, and airports quickly spread throughout the country, linking nearly every major city and regional province, while 15,000 new cars were added to the nation's roads each day. Above all, however, it was high-speed rail that proved to be the driving factor in much of China's rise. Similar to the industrial revolution of 200 years previously, rail provided growth and increased prosperity to every area it connected to. Between 2010 and 2020, China invested $300 billion in constructing over 17,600 km (11,000 mi) of additional rail lines, giving 90% of the population access to the network. From the 2020s onwards, there was further expansion of high-speed rail, the surge in passenger numbers making HSR a lucrative industry. Trains also became more energy efficient and cheaper to operate, while advances in design and technology boosted their speeds by hundreds of miles per hour, making them competitive with flight schedules in many cases. The very fastest routes now included trains travelling at over 1,000 km/h (625 mph). Maglev routes were expanded significantly, especially along the coast. Along with internal connections, plans were formulated to link the Chinese rail system with those of Europe, India, Russia, and Japan. With such a huge rail network, the cities of China were more closely connected than ever before. In a sense, high-speed rail created a 1.2 billion person "single city" effect, with much of the population only a few hours away from each other. 
Along with growth in commerce, rail has driven – and in turn been driven by – China's unprecedented urbanisation. By 2040, over 70% of the population lives in urban areas. Vast megacities, each with more than 100 million people, have formed out of the gradual merging of smaller metropolises. The largest examples today are the three main economic zones: the Yangtze River Delta (Shanghai, Nanjing, and Hangzhou), the Pearl River Delta (Guangzhou, Shenzhen, and Hong Kong) and the Bohai Economic Rim (Beijing, Tianjin, and Tangshan). Despite all this, China's economy has begun to weaken significantly in recent years. With a declining workforce and with most of its growth fuelled by debt, the country is now embroiled in political, economic, and social strife. Restructuring and artificial inflation had managed to sustain the situation temporarily but could only do so much. Worsening climate change is now an additional factor. This is a particular problem in Shanghai, which has been woefully unprepared for sea level rises. Though still experiencing moderate local growth, the country as a whole is now approaching crisis point. By the end of this decade, it will have largely stagnated, becoming one of the last major powers to do so. Completion of the Northeast Corridor high-speed rail route. By 2040, work is nearing completion on a major upgrade of the Northeast Corridor (NEC). America's busiest rail line, the NEC runs from Boston in the north to Washington in the south, via New York. Like many rail services in the US, it had seen decades of underinvestment. Much of the infrastructure was poorly managed and in need of renovation. Tunnels, for example, had speed restrictions due to their obsolete designs, while electrical components dating from the 1930s would routinely fail. There were engine breakdowns, conflicts among trains and frequent delays costing tens of millions of dollars in lost productivity. 
Between 2000 and 2010, intercity ridership on the NEC jumped from 8.2 to 13 million passengers a year. In an effort to address future capacity needs, improve service reliability and reduce travel times, Amtrak formulated plans for a $150 billion, 30-year investment program. This would see construction of a dedicated high-speed route, with trains running up to 220 mph (354 km/h). The plans include fully upgraded tracks and signals, new tunnels, new bridges and expanded stations. Tracks follow the existing NEC and transport networks wherever possible to minimise impacts. The project is implemented in three main phases: the Newark to New York section is completed by 2025; the Washington to Newark section by 2030; and the final section between New York and Boston by 2040. Journey times are dramatically reduced. A trip from Boston to New York that previously took 3 hours and 34 minutes can now be completed in just 1 hour and 34 minutes. A trip from New York to Philadelphia is reduced from 1 hour and 10 minutes to just 37 minutes, while a trip from Philadelphia to Washington is cut from 1 hour and 33 minutes to just 54 minutes. For passengers travelling the entire 438 miles (705 km) from Boston to Washington, this means a total reduction in journey time of more than 50% – from 6 hours and 17 minutes to 3 hours and 5 minutes. Following many years of neglect, this region of the United States finally has a world-class rail system. All civil domestic aviation in Norway is electric. In the late 2010s, Norway's state-owned airport operator Avinor proposed an upgrade of the country's short-haul airline fleet – intended to transition it from traditional jet fuel to electric power. This idea received further attention in 2020, when Avinor and the Civil Aviation Authority released a report exploring these long-term plans in more detail. 
Norway already had more than 200 research and development projects underway for electric or hybrid-electric passenger aircraft, with a particular focus on smaller planes of less than 20-seat capacity. The aviation authorities, collaborating with domestic airlines, used their knowledge and experience of the region to develop zero-emission aircraft suited to Norwegian winter conditions and the runway lengths on the short-haul network. Advances in battery technology offered the potential to fly aircraft longer distances on a single charge. This included a new generation of solid-state batteries with 650 Wh/kg of energy density, compared to 250 Wh/kg for conventional lithium-ion cells. These devices, embedded in the fuselage and wings, enabled a first fully electric domestic flight by 2030. Further progress in the 2030s resulted in additional range extensions, opening up new routes. By 2040, ranges are sufficient to cover Norway's entire short-haul domestic network and form connections to neighbouring Scandinavian cities. Prior to this transition, air travel on domestic routes accounted for 2.4% of Norwegian greenhouse gas emissions. Other benefits of these new planes include a halving of noise pollution, along with lower operational and maintenance costs. Norway serves as a model for other countries and airlines. In subsequent decades, even greater energy densities become possible, enabling larger aircraft with higher seating capacities and longer routes to incorporate purely electric systems. A pan-European hydrogen network has emerged. By 2040, much of western Europe has established a "backbone" for the transport of hydrogen, now a mature industry. This consists of a pipeline network with a total length of almost 23,000 km, supplying up to 1,130 TWh of annual hydrogen demand. It runs through nearly a dozen countries and forms a major part of decarbonisation efforts across the continent. 
The cost of building this network – with an upper estimate of €64 billion – is relatively low. About 75% of the planned infrastructure already exists in the form of redundant natural gas pipelines, more and more of which have been retrofitted, as fossil fuel volumes continue to decline. The amount of electricity required for transporting hydrogen over a distance of 1,000 km is only 2% of the energy content of the transported hydrogen, making it highly efficient and cost-effective. The emergence of this large-scale infrastructure has made it easier to scale up both the production and use of hydrogen. Although outnumbered by the battery electric sector, the market for hydrogen-powered vehicles has grown substantially. This includes ground vehicles such as buses, cars, trains, and trucks, but also now a significant percentage of aircraft and ships. Meanwhile, industrial processes that once relied heavily on fossil fuels – such as steelmaking and rolling, chemical production and so on – are making a rapid transition to clean technology, including hydrogen. The heating of homes, workplaces, and other buildings is also increasingly done with hydrogen. Furthermore, hydrogen can be used to store energy from renewable resources, such as wind and solar, to help balance daily and seasonal fluctuations in supply and demand. A group of European gas companies first presented their proposal for the hydrogen backbone in 2020. Development began in the mid-2020s, with an early pipeline network of 6,800 km connecting local clusters (so-called "hydrogen valleys") by 2030. The next stage involved consumers in the centre of the continent being connected to regions with abundant green hydrogen resource potential, including from Danish offshore wind resources, as well as solar and wind resources from the south of France. 
By 2040, the backbone has further expanded, to become a truly pan-European hydrogen network allowing connections with global import routes, such as from North Africa and the Middle East.
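As a rough sanity check on the pipeline figures above, the 2%-per-1,000 km transport overhead can be applied to the network's annual throughput. The 1,500 km average haul in this sketch is a hypothetical assumption for illustration (as is the linear scaling of overhead with distance); only the 2% figure and the 1,130 TWh demand come from the text.

```python
# Back-of-envelope check of the hydrogen backbone's transport overhead.
FRACTION_PER_1000_KM = 0.02   # electricity used per 1,000 km, as a share of H2 energy content
ANNUAL_DEMAND_TWH = 1130      # upper-end annual hydrogen demand
avg_haul_km = 1500            # assumed average transport distance (hypothetical)

overhead = FRACTION_PER_1000_KM * (avg_haul_km / 1000)   # linear scaling assumed
electricity_twh = ANNUAL_DEMAND_TWH * overhead
print(f"Overhead: {overhead:.1%} of delivered energy")     # 3.0%
print(f"Electricity needed: {electricity_twh:.0f} TWh/yr") # 34
```

At a few tens of TWh per year, the transport overhead is small next to the electricity needed to produce the hydrogen in the first place, which is the point the text makes about the network's cost-effectiveness.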

2041: Global average temperatures have risen by 2°C. At the UN Climate Change Conference of 2009, a rise of 2°C was agreed as the maximum "safe" limit for the global average temperature, beyond which warming would start to become uncontrollable and catastrophic. In the early 2040s, this danger point is passed. This occurs despite the ongoing decline in fossil fuel production, since emissions from earlier decades are yet to have their full effect on the climate system. In other words, while a transition to clean energy is being achieved, global warming remains a deadly threat to civilisation. The cumulative impact of greenhouse gas emissions is enormous, with hundreds of gigatons requiring sequestration from the atmosphere and oceans. It should be noted that 2°C is merely the average global increase. In some regions, such as the poles, the rise has already been far greater. The Arctic is now completely free of sea ice for most of the year, while Greenland will soon be approaching a tipping point of irreversible melting. In America, the arid conditions in the Southwest have continued to worsen. They are now spreading to the south-eastern states, where soybean production has been slashed by half, and a similar yield decrease has occurred for sorghum. Meanwhile, invasive species of insects are migrating to new latitudes, driven by the increasing temperatures. Bark beetles, for example, are moving north and killing off huge areas of forest that provide food to grizzly bears and other fauna. In Europe, the Alps are becoming largely devoid of snow for the first time in millions of years. Because these mountains have long served as the "water towers of Europe", this is having a serious impact on water supplies. Major rivers, like the Rhine, Rhone, and Danube, have until now relied on snow and glacial melt from these mountains. Switzerland, with much of its electricity based on hydroelectric power, is being especially hard hit. 
In addition, record heatwaves are causing gigantic wildfires, the likes of which have never been experienced before. The Mediterranean has lost a fifth of its rainfall and now has an additional six weeks of heatwave conditions each year. At the foot of the Alps, rockfalls triggered by melting permafrost have caused widespread destruction to villages and towns. With skiing impossible in many areas, tourism is being hit hard. In South America, a similar situation has occurred. Melting glaciers in the Andes Mountains have led to water shortages for tens of millions of people, resulting in large-scale displacements. These refugee movements are now a major issue for the region. In Colombia, there has been a marked decline in coffee production – one of the country's main exports, accounting for a significant percentage of world harvests. Asia too has a water crisis. Pakistan's major rivers – the Indus, Jhelum, and Chenab – are delivering under half their historic supply. The nuclear-armed country is now at war with neighbouring India, after conflicts over territory and resources. Monsoon rainfalls have become increasingly unpredictable in the region. Meanwhile, sea level rises have caused further devastation to Bangladesh, which has yet to recover from the disasters of earlier years. Developing regions are disproportionately affected by climate change, and Africa is the worst-hit location of all. Biblical-scale droughts are becoming the norm here, with much of the continent hit by serious declines in agricultural yields. In Mali, three-quarters of the population is starving. In the Western Pacific, Tuvalu is now sharing the same fate as the Maldives: much of the island nation has been inundated. The evacuations from here and other low-lying regions are now a regular feature on the news. Annual deaths from cardiovascular disease have reached negligible levels in the U.S. 
Cardiovascular disease refers to any disease affecting the cardiovascular system, principally cardiac disease, vascular diseases of the brain and kidney, and peripheral arterial disease. The causes are diverse, but atherosclerosis and/or hypertension are the most common. Additionally, aging brings a number of physiological and morphological changes that alter cardiovascular function and lead to increased risk of cardiovascular disease, even in healthy, asymptomatic individuals. In the early years of the 21st century, cardiovascular disease was the leading cause of mortality worldwide – responsible for nearly 30 percent of total deaths annually. In low- and middle-income countries it was increasing rapidly, with four-fifths of cases occurring in those regions. In high-income nations, however, cardiovascular mortality rates had been falling since the 1970s, due mainly to public health efforts and improved medical treatments. Prevention methods included a dramatic reduction in tobacco use (aided by smoking bans), recommended limits on alcohol, fat, and sugar intake, and recommended minimum levels of daily exercise. This trend began to accelerate as a range of new treatment options became available in the 2010s and 2020s. These included stem cells and heart muscle regeneration, microRNA inhibitors to prevent heart enlargement, gene therapy and drugs to treat obesity, 3D printed organs and vessels, nanoparticles, and nanorobotics. By the early 2040s, mortality rates for cardiovascular disease have dropped to negligible levels in the U.S. and many other countries. Oil spills in the Niger Delta have been cleaned up. The Niger Delta is the delta of the Niger River that drains into the Gulf of Guinea off the coast of Nigeria. For millions of years, organic sediments were deposited from the river into the Atlantic, eventually becoming crude oil. 
During the early 21st century, this region was among the world's top oil and gas exploration hotspots. The first drilling operations began in the 1950s, undertaken by multinational corporations that provided Nigeria with the necessary technology and financial resources for extraction. In 1971, Nigeria joined the Organisation of Petroleum Exporting Countries (OPEC). From 1975 onwards, the Delta region accounted for more than 75% of the country's export earnings. Nigeria became Africa's biggest producer of petroleum and was ranked among the top 10 nations globally in terms of proven reserves. At its peak, nearly 2.5 million barrels were being extracted a day. It was estimated that 35 billion barrels lay waiting to be discovered, enough to last several decades. However, the Niger Delta became a centre of controversy over pollution, corruption and human rights violations. Many citizens of Nigeria felt exploited and unable to see the economic benefits of oil companies in the state. Production was affected by political instability and sporadic supply disruptions, attacks on infrastructure and crude oil theft, as local groups sought a share of the wealth. Most of the oil fields were small and scattered. Nearly 160 were found across Nigeria – of which 78 (almost half) lay in the Delta. As a result of the numerous small fields, an extensive pipeline network had been engineered to transport the crude oil and this was vulnerable to sabotage. Many sections of pipeline were also poorly maintained and badly aging. Pipeline explosions killed thousands of people and left many others with serious burn injuries in the 1990s and 2000s. Oil spills were frequent in the region and often devastating to communities based around fishing and farming. A report by the United Nations (UN) found that in one community, families were drinking from wells containing benzene, a known carcinogen, at 900 times the recommended levels. 
The Nigerian National Petroleum Corporation reported an average of 300 individual spills annually. However, as this amount did not consider "minor" spills, the World Bank argued that the true quantity of petroleum spilled into the environment could be as much as ten times the officially claimed amount. In addition, gas flaring was a major issue and contributed vast volumes of air pollution and greenhouse gases. Much of the excess waste from the Delta was immediately burned, or flared, at a rate of approximately 70 million m³ per day – enough fuel to provide the combined annual natural gas consumption of Germany and France. Despite regulations introduced to outlaw this practice, it continued for decades at many drill sites. As the largest wetland in Africa, the Delta was an incredibly rich ecosystem containing one of the highest concentrations of biodiversity on the planet and supporting abundant flora and fauna, arable terrain, and a wide variety of crops. The numerous oil spills and gas flaring in much of the Delta were taking a heavy toll on the environment. Pollution was affecting the air, water, soils, animals, vegetation, and even physical structures. Ken Saro-Wiwa – a Nigerian writer, TV producer, and environmental activist – brought attention to these problems by leading a nonviolent campaign against the degradation of land and waters by the multinational petroleum industry, especially Royal Dutch Shell. He was also an outspoken critic of the Nigerian government, which he viewed as reluctant to enforce regulations on foreign companies operating in the area. He led the Movement for the Survival of the Ogoni People (MOSOP), an indigenous group living in the Delta. In 1993, MOSOP organised peaceful marches of 300,000 Ogoni people, more than half of their population, drawing international attention to their plight. Shell withdrew from Ogoniland, in a major victory for the local residents. 
However, Nigeria's government had recently occupied the region militarily and took decisive action against what it saw as an increasing threat. Thousands of Ogoni people were tortured and killed, and dozens of villages destroyed. Ken Saro-Wiwa himself was tried by a special military tribunal at the peak of his non-violent campaign and charged with masterminding the murder of Ogoni chiefs at a pro-government meeting, in a trial widely criticised by human rights organisations. In 1995, he was hanged, along with eight other activists, by the military dictatorship of General Abacha. Many of the prosecution witnesses later admitted they had been bribed by the government to support the criminal allegations; two who had testified that Saro-Wiwa was involved in the murders recanted, stating that, in the presence of Shell's lawyer, they were offered money and jobs with Shell to give false testimony. The executions provoked international outrage and resulted in Nigeria's suspension from the Commonwealth of Nations. Saro-Wiwa's death was a major setback for the environmental movement, but its efforts to seek justice and compensation would continue over the subsequent two decades with ongoing lawsuits and other actions. While social and political unrest persisted into the early 21st century, Nigeria moved to a more democratic, civilian federal system. In 2011, a UN report – funded in part by Shell, following a request by Nigeria's government – stated that Nigeria's Ogoniland would take 30 years to fully recover from the damage it had sustained, at a cost of $1bn. Then, in 2013, a Dutch court ruled that Shell was liable for pollution in the region. The company was sued repeatedly by local communities, with claims running into many millions of dollars. A major breakthrough in resolving the situation was finally achieved when the Nigerian government – in partnership with oil companies – agreed to act on the recommendations of the UN report. 
In 2016, a full-scale clean-up and restoration plan was officially launched by President Buhari. This initiative, lasting for approximately 25 years, would start with $200m of funding over a period of five years, focussed on a 1,000 sq mile (2,600 sq km) area of land and water near Port Harcourt, the capital and largest city of Rivers State. A factory would be constructed to process and clean tens of thousands of tons of contaminated soil. Alongside this, a mass replanting of mangroves would be undertaken. After this initial phase, the project would gradually expand over the next 20 years with more funding and resources. This would fully restore all of the remaining land, creeks, fishing grounds, mangroves, swamps, and other areas devastated by Shell, the national oil firm and other fossil fuel companies. Thousands of jobs would be created for engineers, manual workers, project managers and inspectors. In addition to ecological repair, a side benefit would be that young people in the Ogoni region (many of whom had rebelled against and sabotaged the oil infrastructure) could now be put to work doing productive and rewarding tasks. In the medium to longer term, it was hoped that a healthier environment would create a more socially and politically stable region – leading to economic progress and sustainable development. This would improve the overall living standards of Nigeria, one of the most rapidly growing countries in terms of population. The plan was not without its problems, of course, with ongoing conflicts in the region, alongside concerns over corruption. However, there was a certain momentum and inevitability to the process, as oil production was declining regardless of any clean-up operations. Solar was becoming so cheap and widespread that it rapidly gained a foothold in many African nations including Nigeria, forming a substantial percentage of energy capacity within just a few decades. 
By the early 2040s, restoration efforts in the Niger Delta have been largely completed, while economic diversification has allowed Nigeria to transition away from its older fossil fuel industries and to attract foreign investment in new areas. Nigeria is now facing an even greater threat, however, in the form of climate change. Between the years 1900 and 2000, average annual rainfall in the country declined by 10%, from 1,400mm to 1,255mm. The trend continued into the 21st century, with rainfall dropping another 5% to below 1,200mm by 2041. The next challenge for Nigeria – and indeed much of Africa – is to increase its access to water. Thankfully, new technologies such as nanofiltration and other extraction techniques are now making this easier. Although Nigeria still faces environmental problems and overpopulation, its outlook is less doom-laden than some had previously feared. Africa as a whole is becoming an increasingly important part of the global economy. Orbital solar power is commercially feasible. After decades of development, energy generated from space-based solar power is now being added to many grids. This concept has been around since the 1970s – but advances in nanotechnology and transmission efficiency have only recently made it both commercially and technically feasible. The system involves placing several large satellites into geosynchronous Earth orbit. Initially, this is financed and carried out jointly by government agencies and private corporations. Very large, nanotech-based surfaces on each satellite's solar array (typically 1 to 3 kilometres in size) capture the energy of sunlight, which is then beamed down to Earth via microwaves or lasers. Large collecting dishes on the ground receive the energy and convert it to usable electricity. There are several benefits to this approach: Higher collection rate: In space, transmission of solar energy is unaffected by the filtering effects of atmospheric gases. 
Consequently, collection in orbit is 144% of the maximum attainable on Earth's surface. Longer collection period: High above the Earth, orbiting satellites can be exposed to a consistently high degree of solar radiation, generally for 24 hours per day, whereas ground-based panels are restricted to around 12 hours per day at most. Elimination of weather concerns: Orbiting satellites reside well outside any atmospheric gases, cloud cover, wind, rain, and other potential weather events. Elimination of plant and wildlife interference. Redirectable power transmission: Satellites can direct power on demand to different surface locations based on geographical baseload or peak load power needs. The climate benefits from orbital solar power as well, since there are no greenhouse gas emissions (though the energy beamed down to Earth is eventually lost as heat). These projects are initially expensive, though, due to the hostility of the space environment. Panels require high-strength shielding to protect against space junk, and their huge surface areas can make them vulnerable to incoming debris. Some of the more high-tech stations feature nanotechnology-based composites that can self-heal. Degradation of the solar panels comes close to making them uneconomical at first, though further advances in technology later solve this issue. Though far from a perfect beginning, space-based solar power grows to become a hugely successful industry in the late 21st and 22nd centuries. Satellites also begin to appear in orbit around the Moon and Mars, greatly boosting the energy available on manned bases. The industry continues to grow around Earth for almost two centuries, until virtually all of the sunlight falling on the planet is being captured and harvested in some way. Supercomputers reach the yottaflop scale. 
By the early 2040s, the world's most advanced supercomputers have reached the yottaflop scale – a magnitude of processing power that enables a trillion trillion floating-point operations per second. This is 1,000 times faster than a zettaflop machine of 2030 and a million times faster than the exaflop machines of 2019. In earlier decades, experts had expressed concerns that Moore's Law – the trend of exponentially increasing computer speeds – was beginning to slow. However, these fears proved to be overstated. While it was true that a slowdown occurred in the 2010s, this was only a temporary blip, as new breakthroughs were being achieved in a number of areas. For example, traditional silicon microchips would soon be replaced by a new paradigm in the form of carbon nanotubes, which could be scaled down to even smaller sizes while greatly improving the speed and energy efficiency of transistors. Other novel concepts were emerging – such as optical computers, based on photons instead of electrons, creating a new generation of dramatically cooler and more energy efficient systems. Quantum computers and related technologies were also providing new ways to overcome barriers to speed and power. All of these innovations paved the way to exaflop, zettaflop and eventually yottaflop computers. By 2041, the available processing power is sufficient to model thousands of human brains, in real time, at the neuron level. In recent years, simulations have also reached the level of electrophysiology, with metabolomes and proteomes soon to follow, and the states of protein complexes expected during the early 2050s. This produces major insights with regard to the study of mental illness, for example, and other aspects of human neurology. These advances continue to increase by orders of magnitude through the remainder of the 21st century – culminating in truly accurate brain simulations and mind uploading in the early decades of the 22nd century. 
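The orders of magnitude quoted above are easy to verify, and a rough brain-count estimate falls out of the same arithmetic. The per-brain compute cost below is a loose hypothetical assumption (published estimates for real-time, neuron-level simulation span many orders of magnitude), chosen only to illustrate how "thousands of brains" could follow from a yottaflop budget.

```python
# Verify the flop-scale comparisons, then estimate concurrent brain models.
EXAFLOP   = 10**18   # floating-point operations per second
ZETTAFLOP = 10**21
YOTTAFLOP = 10**24   # a trillion trillion (10**12 * 10**12) ops per second

assert YOTTAFLOP == 10**12 * 10**12
assert YOTTAFLOP // ZETTAFLOP == 1_000       # 1,000x a zettaflop machine
assert YOTTAFLOP // EXAFLOP == 1_000_000     # a million times an exaflop machine

# Hypothetical cost of one real-time, neuron-level human brain model:
FLOPS_PER_BRAIN = 10**20   # assumed figure, not from the source
print(YOTTAFLOP // FLOPS_PER_BRAIN)          # 10000 concurrent brain models
```

With a lower per-brain cost the count rises into the millions, so the text's "thousands" sits comfortably at the conservative end of such estimates.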
Completion of the W350 tower in Tokyo. The W350 tower is a 70-storey, 350m (1,150 ft) skyscraper built by Sumitomo Forestry in Tokyo, Japan. The building is highly unusual in that its structure consists largely of wood. It is the tallest such building of its kind in the world and more than six times the height of Brock Commons – the previous record holder at the time of its announcement – a 53m (174 ft) wooden residence in Vancouver, Canada. The W350 is also taller than Tokyo Tower (one of the city's most famous landmarks, opened in 1958) and is therefore a major addition to the skyline. The building contains a mix of hotel, office, residential and retail space. It is covered in greenery, flooded with natural light, and features open-air balconies that continue around all four sides, creating a healthy living environment for the occupants. Unlike concrete and steel buildings (which are responsible for 8% and 5% of carbon emissions, respectively), wooden buildings store carbon. The W350 tower removes 100,000 tons of CO2 from the atmosphere. W350 is part of an emerging trend in eco-architecture that is becoming far more prominent around this time, now that environmental considerations are being brought to the fore. At the time of its announcement, the total cost was about 600bn yen ($5.6bn), twice that of a conventional skyscraper of the same size. However, advances in technology led to reductions in the project's cost, along with improvements in fire resistance. Its completion date is 2041 – timed to coincide with the 350th anniversary of Sumitomo Forestry. Cases of lung cancer have spiked in New York. During the terrorist attacks in New York on 11th September 2001, the twin towers of the World Trade Center collapsed into a pile of rubble. A huge volume of debris was pulverised and scattered over the surrounding area. This included 400 tons of asbestos within each tower. Asbestos was banned in New York in 1972, shortly after the towers' construction. 
Midway through their construction, it was known that a ban would be coming into force in the near future. Nevertheless, the decision was made to retain the hazardous material throughout 20 floors of the buildings. It was estimated that 410,000 people – more than one in twenty of the city's population – were exposed to asbestos and other toxic substances embedded in the towers' drywall, insulation, fireproofing and steel structures. The massive cloud of smoke, dust and debris released from Ground Zero affected mainly emergency crews and other rescuers, along with those responsible for clearing the site. However, many others were exposed too, including those in the immediate vicinity who were unable to flee the clouds in time, and local residents living or working nearby. In the months and years following the attacks, growing numbers of New Yorkers reported symptoms of Ground Zero respiratory illnesses. The dust and debris had been "wildly toxic", according to one air pollution expert. Studies found more than 2,500 contaminants from the towers: 50% in non-fibrous material and construction debris; 40% from glass and other fibres; 9.2% in cellulose; and 0.8% from the deadly asbestos, as well as lead and mercury. There were also unprecedented levels of dioxins and PAHs from the fires, which burned for three months. Many of the dispersed substances (asbestos, crystalline silica, lead, cadmium, polycyclic aromatic hydrocarbons) were carcinogenic; others led to kidney, heart, liver, and nervous system deterioration. A case report funded by the Centers for Disease Control and Prevention (CDC) and the National Institute for Occupational Safety and Health (NIOSH) identified carbon nanotubes in dust samples and in the lungs of several 9/11 responders. This led to increasing numbers of debilitating illnesses among the surviving rescue and recovery workers, as well as some residents, students and workers of Lower Manhattan and nearby Chinatown. 
In 2006, it was reported that dozens of recovery personnel had developed cancer – with doctors and epidemiologists confirming these cases as linked to the Ground Zero exposure. The following year, the pulmonary fibrosis death of NYPD member Cesar Borja was reported. In 2010, a study of 5,000 rescue workers found that all had impaired lung function, with an average impairment of 10%. Up to 40% of these workers were reporting persistent symptoms and 1,000 of the group studied were on "permanent respiratory disability". Another study in December 2012 showed that the incidence of prostate cancer, thyroid cancer, and multiple myeloma was significantly elevated among the rescue and recovery workers. More and more deaths were being reported as linked to 9/11, along with at least ten lost pregnancies. Many people were filing lawsuits to seek monetary compensation. There were psychological effects too, in addition to physical problems: studies were finding that exposure to the attacks was a predictor for the development of post-traumatic stress disorder (PTSD). Longer term, new health impacts emerged. For some illnesses related to asbestos exposure, the full effects and symptoms would not be felt for decades. This was the case with certain types of cancer, such as mesothelioma and lung cancer. By 2021, a significant uptick in cases was being observed and this trend continued to worsen, reaching a peak in 2041 – a full four decades after the 9/11 attacks. Although some advances have been made in treating lung cancer by now, survival rates have improved at a slower pace than most other types of cancer.

2042: Global population reaches 9 billion. For the vast majority of human history, the Earth's population stayed below 100 million, and life expectancy was short. Between the mid-19th and early 21st century, however, it mushroomed exponentially. From 1812 to 1930, the number of people on the planet doubled from one to two billion. It took just 30 years to reach the next billion and a mere 14 years to reach the billion after that. Population growth hit its maximum rate in 1963 when it peaked at 2.2% per annum. All around the world, there were unprecedented improvements in mobility, personal income, and general quality of life. The global population continued on its upward trajectory in the 21st century, reaching 9 billion by 2042. However, a raft of new social, political, financial, demographic, environmental and other problems were now having serious impacts on economic growth and prosperity. Automation had made vast swathes of jobs redundant, for example. Many countries had built up enormous debts, particularly in the West. Furthermore, humanity's ecological footprint was too large, and the Earth too small, to support the kind of materialistic lifestyles and throwaway culture that many had taken for granted in the past. The days of rampant consumerism were coming to an end, with people forced to adapt and evolve to new systems, often with heavy government intervention. It was almost as though civilisation was feeling a "hangover" from the boom times. Though life became hard for many, and plagued with uncertainty, people and communities learned to live through each new crisis. Britain, for example, took on a "Blitz spirit" and improved its self-sufficiency in food production – emulating what Cuba had achieved in the 1990s. This involved a combination of land conversion (with unused sites commandeered for crop growing) and urban agriculture. 
Extensive recycling of metals, plastics, glass, electronics, and other useful materials became mandatory, with strict penalties for wastage. The rationing of water became the norm in some regions, as droughts became ever more frequent. Amidst a surge in refugees, tight immigration quotas were enforced, with Britain pulling up a drawbridge and admitting only the most highly skilled foreign workers. Similar programs were enacted in nations around the globe. These measures were the only way to ensure society's survival, in light of the accelerating environmental and resource decline. Social media, communications and information technologies aided in the transition – helping to organise people and community efforts. From the economic wreckage of earlier decades, new forms of socio-economic progress were beginning to evolve. The world was gradually becoming more localised and decentralised. Though Britain was successful in this transition, many other nations were not so lucky. By 2042, resource wars have plunged some regions into chaos. Parts of southern Europe, Africa and the Middle East are dependent on ever-increasing levels of foreign support. Thankfully, population levels begin to plateau in the latter half of the century. Together with ongoing technology advances, this offers hope for longer-term solutions to humanity's problems. White people are a minority in the USA. America is a country founded on immigration. Today, its population is more diverse and multicultural than ever. Following the 1965 immigration reform (which grew from the civil rights movement), the number of non-white people increased dramatically. This was particularly true of Latinos, who went from 6.3m in 1960 to over 50m by 2010. By the early 2010s, non-whites had already begun to outnumber whites in California, Hawaii, New Mexico, Texas, and Washington DC, while nearly half of all children in the nation were non-white. This trend continued over subsequent decades. 
By 2042, white people themselves are a minority. This rapid change in the demographic makeup has significantly altered the political disposition of the country. Latinos, blacks, and other minorities tend to be left leaning. Other factors have influenced voters' preferences – such as the growing urbanisation of the country, with cities tending to favour more liberal and progressive policies than smaller, traditional rural communities. Generation X and Generation Y (the latter now entering their middle age) have also reshaped the political stage, most of them favouring the Democrats. These and other factors have converged to make the old-style Republican Party unelectable. By now, the GOP has been forced to drastically moderate its policies and rhetoric compared to earlier decades. 16K virtual reality is mainstream. By the early 2040s, the audio-visual aspects of virtual reality (VR) have been largely perfected for mainstream users. Gaming and other applications now provide near-fully immersive experiences of both real-world and imaginary places. Most headsets now come with 16K as standard, which offers quadruple the pixel count of 8K displays from 2030 and 16 times that of 4K devices from 2020. The 16K format offers a photorealism that is practically impossible for the human eye to distinguish from the real world. Meanwhile, a mid-range graphics processing unit (GPU) of 2042 has more than 10 petaflops of processing power, or about 1,000 times the equivalent product of 20 years earlier. Brain-computer interfaces (BCIs) have similarly improved by orders of magnitude. The somewhat niche and experimental BCIs of the 2020s and 2030s are being replaced by much more sophisticated versions, making brain signal data less noisy, aided by intelligent algorithms and improved sensor and wireless technology. 
This includes bidirectional read/write links to manipulate a user's peripheral vision and expand their field of view, so the feeling of wearing a headset disappears and they are placed entirely "inside" the game. A player can move a character's limbs by merely thinking about them, while performing actions that are far more complex than with BCIs of previous decades. This sense of immersion is enhanced in other ways – such as improved noise-cancelling techniques in the latest earphones, to block external sounds and further isolate a user from the real world. Audio in general is richer, more atmospheric, more varied, and realistic, with sound waves manipulated in exquisite detail as they bounce around virtual environments and are influenced by different objects, surface textures and weather conditions. In addition, gone are the days of repetitive dialogue by non-player characters (NPCs), which can now generate and sustain natural conversations of indefinite length. This variety and realism are apparent in character models too, which can use generative adversarial networks (GANs) to improve faces, expressions, clothing, and fluidity of movements. While a certain 'uncanny valley' remains, the overall quality of graphics has now reached a breathtaking level, especially when experienced in 16K virtual reality. The processing power of the latest CPUs and GPUs is allowing these NPCs to be placed in extremely complex and lifelike worlds. This is beautifully showcased in development platforms such as Unreal Engine 8.0, three generations more advanced than version 5.0 in 2020. For instance, an entire forest can now be rendered with sub-millimetre precision, showing unique and random features such as individual leaf marks or insect bites. 
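The display comparisons above are straightforward pixel arithmetic. A quick sketch in Python, assuming the standard 16:9 frame dimensions for each format:

```python
# Standard 16:9 frame dimensions (width, height) for each display format.
FORMATS = {
    "4K":  (3840, 2160),
    "8K":  (7680, 4320),
    "16K": (15360, 8640),
}

# Total pixels per frame for each format.
pixels = {name: w * h for name, (w, h) in FORMATS.items()}

# Doubling both width and height quadruples the pixel count, so each
# step up the ladder (4K -> 8K -> 16K) is a 4x jump in pixels.
print(pixels["16K"] / pixels["8K"])  # 4.0
print(pixels["16K"] / pixels["4K"])  # 16.0
```

At these frame sizes, 16K carries exactly 16 times the pixel count of 4K (about 133 megapixels per frame against 8.3), which is why even a mid-range GPU of this era needs petaflops of throughput to drive it.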
Large cities are fully explorable in terms of shops, bars, and other interiors, allowing a player to enter and meet with AI that responds in nuanced ways – as opposed to the limited number of venues and interactions in games of earlier decades. This combination of 16K resolution, petaflops of processing power, human-like AI, more advanced BCIs and audio techniques has allowed virtual reality to reach a whole new level of sophistication and capability. New form factors have emerged too, such as contact lenses with embedded displays. With graphical and audio effects in VR being essentially perfected, attention is now turning to the other senses (tactile, olfactory and taste), which have yet to achieve this level of immersion. Although some exciting developments have occurred in haptic feedback, such as new gloves and suits, a lot more research is needed on this technology. The olfactory and taste senses, meanwhile, remain at an even earlier stage, with only a few novelty consumer devices of poor quality and limited appeal. However, technology is becoming ever more compact and miniaturised, offering new possibilities for the decades ahead. Computer chips will soon reach the size of individual blood cells, for example, enabling the first laboratory experiments of so-called "full-immersion virtual reality" in animal models. NASA Uranus mission. Although visible to the naked eye, Uranus was never acknowledged as a planet by ancient observers, because of its dimness and slow orbit. With the advent of larger telescopes and new observational techniques, it was finally recognised by William Herschel in 1781. Herschel initially believed he had found a comet or stellar disk, but his later studies revealed its true nature and location as a world of our Solar System beyond Saturn. 
Named after the Greek god of the sky, Uranus was calculated to orbit the Sun once every 84 years (compared with 11.9 for Jupiter and 29.5 for Saturn), at an average distance of 20 AU (3 billion km; 2 billion mi). This made it considerably more remote than its larger siblings, with a much lower temperature and greater concentration of "icy" volatiles such as water, ammonia, and methane. In fact, the lower atmosphere of Uranus was later determined as being one of the coldest places in the Solar System, at 49 K (−224 °C; −372 °F). Another notable feature was its axis of rotation being tilted sideways, placing the equator where most other planets would have their north and south poles. Herschel discovered the first two moons – Titania and Oberon – in 1787. More were identified in 1851 by William Lassell (Ariel and Umbriel), and in 1948 by Gerard Kuiper (Miranda). These five had planetary mass, meaning they could be considered (dwarf) planets if they were in direct orbit about the Sun. The other remaining moons were discovered after 1985. All of these satellites were named after characters from the works of William Shakespeare and Alexander Pope. Rings around the planet were discovered in 1977, intermediate in complexity between the more extensive set around Saturn and the simpler systems around Jupiter and Neptune. Until the late 20th century, the exploration of Uranus was solely through telescopes. This changed in 1986, when NASA's Voyager 2 spacecraft made its closest approach and passed within 81,500 km (50,600 mi) of the cloud tops. As it sped by, it studied the cold atmosphere, discovered 10 moons, and examined Uranus' ring system, discovering two new rings. It also imaged the five largest moons, revealing that their surfaces were covered with impact craters and canyons. Voyager 2 was only a flyby, however, as opposed to an orbiter, so its time at Uranus was relatively brief. 
Decades passed without further exploration of the ice giant – with attention instead shifting to Jupiter (Galileo and Juno), Saturn (Cassini–Huygens) and Pluto (New Horizons). Ongoing telescopic studies revealed various new moons, bringing the total number to 27. Several mission concepts for Uranus were discussed by NASA and ESA in the late 2000s and early 2010s, including a joint collaboration between the two agencies. However, these were considered to be of lower priority than Mars and Jupiter. By the late 2010s, following the success of these various other missions around the Solar System, new exploration targets were needed, and Uranus was now seen as a more promising candidate. Astronomers had recently discovered that exoplanets with masses similar to Uranus and Neptune were more common than gas giants like Jupiter and Saturn. Determining the basic composition and interior structures of these ice giants would fill an important gap in the knowledge of how other star systems formed. A close-range study would also reveal more clues about the surprisingly dynamic icy moons, rings, and bizarre magnetic fields; as well as potentially finding new satellites. Furthermore, in the decades following Voyager 2's visit, Uranus had moved far enough along its orbit to bring more of its equator into sunlight. This was making its atmosphere appear less bland and featureless than before, with a number of dramatic storms and swirls becoming visible. In its decadal survey for 2013-2022, NASA had included a return to Uranus or Neptune as one of its longer-term objectives, with Uranus being favoured due to more convenient planetary alignments. In June 2017, NASA released a follow-up analysis – Ice Giants: Pre-Decadal Survey Mission Study Report – as a precursor to the next decadal survey (2023-2032). 
This comprehensive, 529-page report outlined in more detail the options available for a return to both Uranus and Neptune; most of the concepts proposed detachable probes to be dropped into the ice giants' atmospheres, with the main spacecraft remaining in orbit for two to three years. Because of reduced sunlight in the outer Solar System, it was advised that a nuclear power source be used instead of solar. The science payloads of up to 150 kg included cameras (with spatial resolution of less than 100 metres), magnetometers, spectrometers and doppler imaging for the detection of "seismic waves" around the rocky core and icy mantle layers. For Uranus, the optimal launch window was between 2030 and 2034, with a journey time of around 12 years. However, the development of new and more powerful rockets, such as NASA's heavy lift Space Launch System (SLS), provided the option to cut four years off the transit time. Thus, a mission timeframe of between 2038 and 2046 appeared most likely. Using the SLS would require aerocapture techniques upon arrival – skimming the planet's outer atmosphere as a "braking manoeuvre" to avoid overshooting the target. Despite aiming for a 2030 launch date, NASA experienced the usual delays and budget setbacks that had befallen so many of its earlier missions. Consequently, the Uranus mission was pushed closer to the end of its launch window. However, the SLS had been demonstrated for deep space travel and was now utilised for more rapid transit across the Solar System, offsetting the aforementioned delays. By 2042, the Uranus mission is well underway, revealing a wealth of new scientific data about its atmosphere, interior, rings, and satellites. The planet is globally surveyed with unprecedented detail; its dozens of moons are imaged in stunning resolution, while new moons are discovered too. The Diary of Anne Frank enters the U.S. public domain. 
The Diary of Anne Frank was a collection of writings by the German-born diarist Anne Frank, who documented her life in hiding between 1942 and 1944 during the Nazi occupation of the Netherlands in World War II. She and her family were apprehended in 1944, with Anne herself dying of typhus in the Bergen-Belsen concentration camp the following year. Anne's father, Otto, the only survivor of the family, returned to Amsterdam after the war to find that her diary had been saved by one of the helpers, Miep Gies, and his efforts led to its publication in 1947. It became one of the world's most widely known books, with Anne gaining fame posthumously as perhaps the most discussed victim of the Holocaust. It was translated into more than 60 languages and formed the basis for several plays and films. The diary received widespread critical and popular attention on the appearance of its English-language version, Anne Frank: The Diary of a Young Girl, published by Doubleday & Company (USA) and Valentine Mitchell (UK) in 1952. The book later appeared in several lists of the top books of the 20th century. The copyright of the Dutch version, published in 1947, expired in 2016, 70 years after the author's death, as a result of a general rule in the copyright law of the European Union. Following this, the original Dutch version was made available online. It then appeared on Wikisource, an online digital library of free content operated by the Wikimedia Foundation. However, the diary was soon removed from Wikisource, after concerns over copyright. The Wikimedia Foundation was under the jurisdiction of U.S. law and therefore subject to the Digital Millennium Copyright Act (DMCA) – specifically title 17, chapter 5, section 512 of the United States Code. As the Foundation had noted in 2013: "The location of the servers, incorporation, and headquarters are just three of many factors that establish US jurisdiction ... 
if infringing content is linked to or embedded in Wikimedia projects, then the Foundation may still be subject to liability for such use—either as a direct or contributory infringer." With 70 years having passed since Anne's death, her diary entered the public domain in the Netherlands on 1st January 2016 (although there was still some dispute about this). In the United States, however, the original text would remain under copyright until 2042. This was due to a combination of several factors. Firstly, works published before 1978, in general, had a copyright of 95 years from date of publication, which in the case of Frank's diary was 1947. Secondly, foreign works of countries that were treaty partners to the United States were covered as if they were U.S. works. Thirdly, even if a country were not a treaty partner under copyright law at the time of a publication, the 1994 Uruguay Round Agreements Act (URAA) restored copyright to works that: had been published in a foreign country; were still under copyright in that country in 1996; and would have had U.S. copyright but for the fact they were published abroad. The City of Trees project is completed in Manchester, England. The City of Trees is a project to plant three million new trees, one for every man, woman and child in Greater Manchester, England. Initiated by charities including the Community Forest Trust, it aims to transform over 20 square kilometres of unmanaged or vacant land into a productive and healthy state. This is achieved with help from public volunteers, community groups and local companies. The trees are planted all over Manchester – in urban areas, streets, parks, and gardens, as well as existing woodlands to expand their habitats and link to neighbouring areas of tree cover, which helps to improve biodiversity. Entirely new forests are also created. The 25-year project is largely completed by the early 2040s. 
Many of the first trees that were planted back in the early stages of the project have now grown to full maturity. Alongside the uptake of electric vehicles and other measures, the city is now enjoying much improved air quality, as well as better resilience to climate change. The benefits of trees include reduced heat and air conditioning costs (saving up to 10% on energy consumption) and a reduction in flood risk (trees absorb surface water runoff, which would otherwise overload drainage systems, and reduce it by up to 80% compared to asphalt surfaces). The presence of trees can also improve mental health, increase educational performance and cognitive development, and create new jobs in timber and forest management.
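The copyright terms behind the dates above reduce to simple date arithmetic. A minimal sketch, ignoring the many special cases in real copyright law (renewals, treaties, the URAA):

```python
# A simplified sketch of the two copyright rules described above. Real
# copyright law has many more special cases, so treat these as rules of
# thumb rather than a legal calculator.

def us_pre1978_last_year(publication_year: int) -> int:
    """U.S. works published before 1978: protected for 95 years
    from the year of publication."""
    return publication_year + 95

def life_plus_70_last_year(death_year: int) -> int:
    """'Life plus 70' rule (EU and UK): protection runs for 70 years
    from the end of the calendar year of the author's death, so the
    work enters the public domain on 1 January of the following year."""
    return death_year + 70

# Anne Frank died in 1945; the diary was first published in 1947.
print(us_pre1978_last_year(1947))    # 2042 (last year of U.S. protection)
print(life_plus_70_last_year(1945))  # 2015 (Dutch public domain from 1 Jan 2016)
```

The same life-plus-70 arithmetic gives 1973 + 70 = 2043 for J.R.R. Tolkien, hence the 1st January 2044 date in the entry for that year.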

2043: The Ross Sea has lost 50% of its summer ice cover. The Ross Sea is a large bay of the Southern Ocean in Antarctica. Like much of the frozen continent, it had been gaining ice at the start of the 21st century. This was due to various factors including changes in wind speed, precipitation, salinity, ocean currents, and air and water temperatures. In subsequent decades, however, a rapid reduction of ice cover began to occur during the summer months as temperatures in the region soared, with corresponding changes in wind patterns and ocean currents. By 2043, half of the summer ice cover has been lost and is continuing to decline, now on track to decrease by 56% by 2050 and 78% by 2100. The Ross Sea is critically important in regulating the production of Antarctica's sea ice overall. The decline now being witnessed therefore has long-term implications for the continent as a whole. This comes at a time when commercial interests are beginning to eye the potential for resource extraction, as the Antarctic Treaty is due for review in 2048. Marine life in this productive and once unspoiled ecosystem is also being negatively impacted. A number of important species are dependent on the ice during their life cycles, including crystal krill and Antarctic silverfish. Krill are a major food source for the Ross Sea's top predators – minke whales, crabeater seals, and Adélie and Emperor penguins (the latter may go extinct by 2100, if trends continue). Slovenia closes down its only nuclear power plant. The Krško Nuclear Power Plant is located in Krško, Slovenia. It was built between 1975 and 1983 as a joint venture by Slovenia and Croatia, which were at the time both part of Yugoslavia. With 730 megawatts of generation capacity, it provided more than one-quarter of Slovenia's and 15 percent of Croatia's power. In 2008, a coolant leak was reported, triggering fears of a Chernobyl-style disaster, and prompting an EU-wide alert. However, this turned out to be a false alarm. 
The incident resulted in a relatively large amount of media attention for what was a minor malfunction. The planned retirement date for the plant was 14th January 2023. The decommissioning plan that was ratified by the Slovenian and Croatian parliaments scheduled the start of disassembly shortly after that, with the taking apart of the plant lasting until 2036. However, an application to extend the plant's lifetime by 20 years – to 14th January 2043 – was subsequently submitted to the Slovenian regulatory body (URSJV) and approved. The Chang'e-3 lander is shutting down. Chang'e-3 was a Chinese spacecraft launched in 2013. It was the country's first unmanned probe to touch down on the Moon's surface and included both a lander and rover. The landing site, in Mare Imbrium, was determined by topographic data from the previous Chang'e 1 and 2 orbiters. On the surface, Chang'e-3 deployed a 140 kg (310 lb) rover named Yutu, which was designed to explore an area of 3 square kilometres (1.2 sq mi) during a three-month mission. Equipped with a digging tool, it performed analysis of soil samples. Using a radar on its underside, it also obtained the first direct measurements of the structure and depth of lunar soil down to 30 m (98 ft) and studied the lunar crust at several hundred metres' depth. The lander, meanwhile, featured the first automated and remotely operated telescope to be deployed on another world. The thin exosphere and slow rotation of the Moon allowed for extremely long and uninterrupted views of targets and their light variation, improving a number of astronomical models. These observations included galaxies, active galactic nuclei, variable stars, binaries, novae, quasars, and blazars, as well as the structure and dynamics of the Earth's plasmasphere. Three cameras installed on the lander – each facing in a different direction – provided high-resolution panoramic imagery of the Moon's surface. 
NASA's Lunar Reconnaissance Orbiter (LRO) took a photo of the landing site, in which the rover and lander were both clearly visible. With solar panels and a radioisotope heater unit (RHU) designed to last 30 years, Chang'e-3 was a very long-running mission. For comparison, NASA's Mars Science Laboratory had a lifespan of 14 years. While the rover stopped working in March 2015, the telescope on the stationary lander had enough power to operate until 2043. By this date, Chang'e-3 is finally shutting down. Like the Apollo landing sites of the late 1960s and early 1970s, it earns a historic status and eventually becomes a tourist attraction when civilians are able to visit the Moon. Given the lack of atmosphere or weather conditions on the surface, it remains in a pristine state into the distant future.

2044: The works of J.R.R. Tolkien enter the public domain. Copyright law has remained largely unchanged over the years. Accordingly, the English writer J.R.R. Tolkien's books and writings – including his famous Lord of the Rings trilogy and The Hobbit – enter the public domain on 1st January 2044. The law states that this occurs "at the end of the period of 70 years from the end of the calendar year in which the author dies", which in this case was 1973. A work entering the public domain means that anyone is free to do whatever they want with it. Film and stage adaptations, re-prints, assorted varieties of fanfiction, video games and much more besides, can all be created, for profit, without the approval of the Tolkien Estate. Lord of the Rings had regularly appeared in lists of the most read books of all time. A tipping point for permafrost melting. Scientists had warned that a global average temperature rise of 2°C was the maximum safe limit, beyond which climate change would race out of control. In earlier decades, tipping points were already being reached, such as the Arctic becoming ice-free during summer. By 2044, an even greater and more serious threat is emerging in the form of greenhouse gases from melting permafrost. The quantity and extent of permafrost – defined as soil at or below the freezing point of water, 0°C (32°F), for two or more years – had been well-documented. It was known that enough carbon lay trapped in Arctic ice to effectively double the level already in the atmosphere. Vast stores of methane, a greenhouse gas over 25 times more potent than CO2 on a century timescale and 72 times more potent during its first 20 years, were also present, with potential to drastically alter the climate. Studies found that in many places, long-frozen soils were collapsing as they melted, forming erosional holes and landslides that revealed larger areas of permafrost beneath. 
This was accelerating the process of melting and greenhouse gas release, since these larger areas were then exposed to direct sunlight. Alongside this finding, evidence from Siberian caves suggested that a global temperature rise just 1.5°C (2.7°F) above current levels would see permanently frozen ground beginning to thaw over a vast area, representing a major tipping point. In addition to climate destabilisation, disastrous impacts on infrastructure are now being felt in the region, with natural gas facilities, roads, railways, and buildings in general – all built upon permafrost – vulnerable to thawing ground. Adjoining regions are witnessing significant changes, with Mongolia's Gobi Desert becoming far wetter. In the decades ahead, this extremely arid area will come to resemble the present-day Asian steppes. With large-scale permafrost melting underway, global temperatures are now on track to reach 3°C (5.4°F) above the 20th century average by the mid-2050s. A genuine catastrophe is looming, threatening billions of lives. The last veterans of WW2 are passing away. During this decade, the last surviving veterans of World War II are passing away. A small number of them reach their 120th year, allowing them to attend the 100th anniversary commemorations of D-Day, on 6th June 2044. On this date, a time capsule is opened at the American Cemetery at Colleville-sur-Mer, close to the site of the Normandy landings which claimed so many lives. This contains press articles from the time – including a message from President Eisenhower to future generations. A transglobal highway and rail network has emerged. Midway through the 2040s, practically all major continents are connected to each other by road, rail, and tunnel. This transglobal highway, as it comes to be known, was never a definite goal nor an individual project. 
It was instead formed by long-term, incremental steps, as countries around the world reached across their borders and waterways to formerly distant neighbours. Over the years, from the mess of highways and railways criss-crossing the globe, a specific route began to emerge, uniting every populated continent with the exception of Australia. Arguably one of the first steps in this process was the Trans-Siberian Railway, on which construction began in 1891. This was a network of railways connecting Moscow with the Russian Far East and Sea of Japan. It became the world's longest railway, stretching 9,300 km (5,800 mi), with expansion continuing more than a century later. Another major transport network to emerge during the 20th century was the Pan-American Highway. This linked the mainland nations of the Americas – extending from Prudhoe Bay in Alaska to the lower reaches of South America. With a total length of 48,000 km (30,000 mi), it became the world's longest "motorable road". However, because of the Darién Gap, it was not possible to cross from Central America to South America by traditional motor vehicle. The Channel Tunnel (an idea first proposed in 1802) was officially opened in 1994. This provided the first direct physical link between Britain and mainland Europe since the Ice Age. This megaproject included the longest undersea portion of any tunnel in the world, at 37.9 km (23.5 mi) in length. In 2000, the Øresund Bridge was completed. This became the longest combined road and rail bridge in Europe and connected the major road networks of Scandinavia to those of Central and Western Europe. It was followed by the Fehmarn Belt Fixed Link, providing a more direct route from Germany to Sweden and Norway. By the 2020s, several major new connections were being completed around the world. One notable project was the Sunda Strait Bridge, linking the Indonesian islands of Sumatra and Java. This crossing utilised islands in the strait as stepping stones. 
It carried a six-lane highway and a double-track railway and cost upwards of $12bn. The most extensive portion – a suspension bridge – had a central span of 3,000 m (9,800 ft), making it the second longest suspension bridge in the world at the time. By 2025, another major link was completed – this time in the Mediterranean. The Strait of Gibraltar Tunnel had been in the planning stages for decades, but in the mid-2010s it finally got off the ground. Negotiations between Spain and Morocco were drawn out, but an agreement over costs and economics was finally reached. Building the tunnel posed serious engineering challenges, mainly because of the deep seabed found in the strait: around 300 metres in the shallowest areas. A car tunnel was ruled out, since removing exhaust fumes from that depth proved infeasible. The engineers behind the project also determined that, in order to keep the gradient manageable, the entrances on either side would need to be almost 3 km away from the coast. Despite rising costs, the project continued to move forward, construction technology overcame the structural challenges, and midway through the 2020s, the first trains made the journey from one continent to another. Meanwhile, road and rail projects were growing outwards on both sides, with new routes expanding through Africa and existing routes being strengthened in Western Europe. The 2020s also saw development of a bridge linking Africa with the Middle East. As it became more developed, China greatly expanded its infrastructure, with hundreds of miles of rail track being added each year. Major new transport links began to unify Southeast Asia as a whole. In the 2030s, Japan was connected to the mainland by rail for the first time. This was achieved via the new Sakhalin–Hokkaidō Tunnel, running between the Japanese island of Hokkaidō and the Russian island of Sakhalin. 
A shorter tunnel connecting Sakhalin to mainland Russia had been completed years before, which – coupled with the new 42 km long rail tunnel – opened new trade routes between Japan and continental Asia. As at Gibraltar, many issues had to be overcome, primarily political ones. For years, there was debate as to the economic usefulness of the tunnel, until the demand for such a crossing manifested itself, at which point it was finally agreed that construction would be productive. Thanks to the long-existing Seikan Tunnel, all four of the main Japanese islands were then connected to Russia. Another huge project of this time was the Malacca Strait Bridge. This connected Sumatra (the largest island entirely within Indonesia) to western Malaysia and mainland Asia, with a sea-crossing bridge spanning almost 50 km (31 miles). Combined with the Sunda Strait Bridge (see above), this greatly expanded the Indonesian economy – providing new markets and access to China. By the 2040s, Japan saw completion of its second mainland connection, this time between it and South Korea. This project had by far the most problems of any major link, due to the differing political and cultural climates of the two countries. Trade and travel necessity ultimately pushed the project along, after considerable delays. One of the most ambitious of all the transglobal highway projects – the Bering Strait Crossing – was also completed early in this decade. A joint venture between Russia and the United States, it was the first direct infrastructure link between the two countries. It consists of three tunnels, connecting the continents via the Diomede Islands. During its development, rising global temperatures actually benefited the project, having made construction easier in the warming Siberian landscape. Coupled with the crossing was a series of road and rail projects branching out through Eastern Russia – eventually joining up with China, Japan and all the way through to Europe. 
In other words, it was now possible to board a train in London and travel directly to New York City, via Moscow and Yakutsk. Ironically, the last connection of the transglobal highway to be completed was actually the shortest. For decades, the Darién Gap in Panama, a near-impenetrable wall of jungle and brush, had blocked road connections between North and South America. Demand was slow to materialise for the connection, and environmental protection prohibited a land crossing. In the early 2040s, a compromise was reached, as the influx of climate change refugees from drought-stricken South American countries demanded an easy path northward. A bridge was constructed across the Gulf of Urabá between Panama and Colombia, bypassing the Darién Gap. By 2044, people can travel to all corners of the globe using direct road or rail links. However, geopolitical stresses are beginning to slow the expansion of new connections. Resource wars have made the opening and closing of national borders increasingly frequent and unpredictable, at times cutting off sections of the network. Despite this, it continues to be largely operational as a whole, due to its sheer size and extent. In a world wracked by environmental catastrophe, it serves as a lifeline for many countries – since food, water, metals, minerals, and other natural resources can be easily and quickly moved from more prosperous regions. Although the transglobal highway is stagnating in growth, it remains a vital piece of infrastructure for decades to come. Five-year survival rates for bowel cancer are approaching 100%. In the early 21st century, bowel cancer was the third most common cancer in the world. As of 2008, there were over 1.2 million new cases and 608,000 deaths globally each year. It was more common in developed countries, where around 60% of cases were diagnosed. The highest rates were found in Australia, New Zealand, Europe and the US, and the lowest rates in Africa and South-Central Asia. 
Risk factors for bowel cancer were known to include older age (50+), male gender, high intake of fat, alcohol or red meat, obesity, smoking and a lack of physical exercise. Approximately 10% of cases were linked to insufficient activity. The risk from alcohol appeared to increase beyond one drink per day. Treatment for bowel cancer depended on how advanced a particular case was – but generally included surgery, chemotherapy, and radiation. For those with incurable cases, a range of palliative care treatments were available for reducing pain and improving quality of life. Research into bowel cancer led to a marked improvement in survival rates during the late 20th century, a trend which continued into the 21st. Rapid falls in genome sequencing costs, together with increasingly sophisticated mouse models, provided ever more valuable insights into the nature and mechanisms of the illness. New treatments and earlier diagnoses were made possible using targeted nanoparticles and eventually nanorobotics for delivering medicines with greater precision and fewer side effects. Five-year survival rates are now approaching 100% in the US, with many other countries not far behind.

2045: Humans are becoming intimately merged with machines. In some fields, the pace of technology has become so fast that humans can no longer comprehend it without augmenting their own intelligence. This is particularly true of computing, nanotechnology, medicine, and neuroscience, all of which have seen exponential progress. A high-end consumer PC of today has an integrated AI with processing power equivalent to over a billion human brains. This machine can think for itself, communicate with its user, and suggest new ideas in ways that surpass even the greatest minds on Earth. Due to the flood of data being exchanged over the Internet and other connected devices, these computer AIs often send and receive millions of communications each day. The only way for a user to fully interpret this avalanche of information is to merge their consciousness with that of the machine. A growing segment of society is now turning to on-person hardware to achieve this. The most advanced method involves the use of microscopic, wireless, implantable devices linking neural activity directly to electronic circuitry. These have already been used in medical procedures and niche consumer technologies. The latest versions are capable of marrying AI with human intelligence in ways that combine the best aspects of both. No monitor or projector of any kind is required for the latest generation of computers. The implants can instead produce a virtual image of the screen, augmented in the user's field of vision. This operating system is controlled by their thoughts – and those of the AI – running at speeds vastly greater than a real-time physical version would allow. Many individual actions can be performed at once, thanks to robust wireless connections between the nanobots and neurons. If necessary, a user's entire sensory experience can be instantly shifted to full immersion virtual reality. 
This is a popular choice for gaming and entertainment, but also has many practical applications in the world of business. Meetings and conferences can be scheduled between vast numbers of participants from around the globe – with barely a few seconds' notice – and often last only a few seconds. Communicating at this speed is not possible using conventional means, creating a significant divide between those who have the technology and those without. For many people, nanobot implants are becoming permanent and essential – rather than temporary and optional – due to the sheer speed and level of information now being encountered in day-to-day situations, together with the explosive growth of AI. Military personnel, scientists and medical staff were among the first to take advantage of them, but mainstream society is now following. People are merging with machines in various other ways, too. Nanobots can boost immune systems, for example, helping to exterminate pathogens. They can also regulate blood pressure, repair some of the damage caused by the aging process, or accelerate the healing of wounds. Cybernetic organs are now available that almost never fail and can filter deadly poisons. Brain-computer interfaces are increasingly used in middle class homes to open doors, control lighting and operate everyday appliances. The most extreme cases of enhancement involve people opting for decentralised circulatory systems, as well as a form of synthetic blood, reducing physical vulnerability still further. This particular option is only available to the wealthy at this stage, as it involves a highly complicated procedure that radically alters their internal anatomy. The end result is that a person can survive multiple gunshot wounds and other internal damage relatively easily. Politicians and a number of famous celebrities are taking advantage of this. It is also popular with gangsters and career criminals. 
The line between man and machine is starting to blur. Later this century, there will no longer be a clear distinction.

Air accident fatalities have been eliminated. Recent decades have seen an explosion in the level of computer control in vehicles of all types, greatly reducing the need for human involvement. This includes the aviation industry, which has become highly automated. Although manned crews may still be present, their roles are limited and supervisory, with only minimal need – if any – for intervention during flight. On the ground, airport infrastructure and navigation systems have seen major overhauls, leading to vastly improved traffic management and safety. Revolutionary new materials (such as graphene) – in combination with self-healing composites, nano-sensors and other systems embedded throughout the wings and body – have largely eliminated the structural failure issues that plagued earlier generations. Most aircraft now run on purely electric systems, without the need for dangerous and combustible liquid fuels. Hijacking and violent incidents, meanwhile, are now practically impossible, due to the sheer level of surveillance and security in place. Quantum encryption has made hacking of navigation systems difficult if not impossible. Consequently, air crashes involving passenger deaths in large commercial aircraft are now almost unheard of in the news. The age-old phobia of flying will soon become a thing of the past. In addition to safety, the rise of computer intelligence in aircraft design is leading to improvements in overall comfort experienced by travellers.

The Chūō Shinkansen high-speed maglev route is complete. Tokyo and Osaka are now connected by a direct high-speed maglev route – the Chūō Shinkansen. This megaproject began construction in 2014, at a cost of over 9 trillion yen ($115bn). 
By 2027, the first trains were running between Tokyo and Nagoya, and by 2045 the route has been extended to Osaka, with trains travelling beneath the Japanese Alps (Akaishi Mountains). The first generation of these vehicles reached 313 mph (505 km/h), but newer and even faster designs are now in use.

Gulf Coast cities are being abandoned due to super hurricanes. The rapid growth of CO2 emissions has led to rising sea levels, a warming of coastal waters and a more volatile climate system. In the Gulf of Mexico, a new category of "super hurricane" has emerged and is now a regular occurrence. These extreme weather events are nightmarish in scale and intensity. At their peak, winds of 200 mph bring untold devastation. Trees are uprooted and hurled like matchsticks, while skyscrapers visibly sway. Storm surges and flash floods travel up rivers with surreal speed, overwhelming defences and bringing waves tens of metres high. Damage from these and various other disasters has run into hundreds of billions of dollars. A number of Gulf Coast cities are being permanently abandoned during this time, including Houston and New Orleans.

Apollo 12's third stage returns to Earth. In November 1969, the Apollo 12 Saturn V blasted off towards the Moon. This rocket consisted of a three-stage launching system. While the first and second stages dropped back to Earth after launch, the third stage (S-IVB) was used to propel the docked Apollo Command Module and Lunar Module into a lunar trajectory. In 2002, amateur astronomer Bill Yeung discovered what appeared to be a 30-metre (100 ft) asteroid in orbit around the Earth. Initially, it was believed to be a second natural satellite of Earth (the Moon being the first). However, later measurements of the electromagnetic spectrum were consistent with the titanium dioxide paint used for the Saturn V rockets. 
Backtracking its orbit showed that the object had been orbiting the Sun for 31 years and had last been near Earth's vicinity in 1971. This seemed to suggest it was part of Apollo 14, but NASA knew the whereabouts of all hardware used for that mission – the third stage, for instance, was deliberately crashed into the Moon for seismic studies. Another explanation emerged that it could be the S-IVB third stage for Apollo 12. NASA had originally planned to dump this spent component into a solar orbit, but an extra-long burn of the ullage motors meant the remaining propellant could not provide enough energy to escape the Earth–Moon system. Instead, this stage ended up in a highly erratic orbit around the Earth, after passing by the Moon. Sure enough, this antique space junk, with an empty weight of 9,600 kg (20,000 pounds), would return to Earth during the mid-2040s, after drifting for over 75 years. Objects with similar mass tend to impact Earth's surface about once every 10 years. Total solar eclipse in the USA. A total solar eclipse occurs on 12th August 2045; the fourth longest of the 21st century. It is visible throughout much of the continental United States, with a path of totality running through California, Nevada, Utah, Colorado, Kansas, Oklahoma, Arkansas, Mississippi, Alabama, and Florida. The eclipse is greatest over the Bahamas, before continuing over the Virgin Islands, Hispaniola, Venezuela, Guyana, Suriname, French Guiana, and Brazil. The path of totality is witnessed over many major cities – including Reno, Salt Lake City, Colorado Springs, Oklahoma City, Tulsa, Fort Lauderdale, Miami, Orlando, St. Petersburg, Tampa, Nassau, and Santo Domingo. It lasts for at least six minutes along the part of the path that starts at Camden, Alabama, crossing Florida and ending near the southernmost Bahama Islands. 
The longest duration of totality is six minutes and 5.5 seconds, located at the coordinates 25°54.594'N 78°32.19'W, which is over the Atlantic Ocean east of Fort Lauderdale and south of Freeport, Bahamas. A solar eclipse with a very similar path occurred in 2017.

The Boeing B-52 Stratofortress is retired from service. The Boeing B-52 Stratofortress was an American long-range, subsonic, jet-powered strategic bomber, operated by the United States Air Force (USAF) since its official introduction in 1955. Capable of holding up to 32,000 kg (70,000 lb) of weapons, it had a typical combat range of more than 8,800 miles (14,080 km) without aerial refuelling. A total of 744 of the aircraft were built, with the last one delivered in 1962. The strength, reliability and versatility of the B-52 allowed it to become the backbone of the U.S. Air Force and a veteran of numerous wars. It played a key role during the Cold War – carrying nuclear weapons for deterrence and standing on alert, ready to conduct "doomsday missions" if necessary, though it only ever dropped conventional munitions in combat. During the 1960s, it performed large numbers of conventional strikes in Southeast Asia, pounding enemies with massive firepower. While some variants were retired, the B-52 continued to see active service for many more decades, undergoing a series of upgrades and modernisations with new avionics, equipment, and weapons. B-52 strikes were an important part of Operation Desert Storm (1991) and Operation Desert Strike (1996), the latter including a 34-hour, 16,000-mile round trip from Guam – the longest distance ever flown for a combat mission. B-52s also contributed to Operation Enduring Freedom in Afghanistan/Southwest Asia (2001) and Operation Iraqi Freedom (2003). Superior performance at high subsonic speeds and relatively low operating costs continued to keep the B-52 in service, despite the introduction of more advanced aircraft. 
As of 2015, 58 were in active service with 18 in reserve. After being upgraded again, it would serve until 2045 before finally reaching the end of its life, an unprecedented 90 years after introduction. The B-52 is replaced by the Northrop Grumman B-21 Raider.

Major extinctions of animal and plant life. By the end of this decade, many well-known animal species are going extinct, or have declined in such numbers that only captive populations remain. Off the eastern coast of Australia, one of the world's greatest natural wonders – the Great Barrier Reef – has been almost completely destroyed, with less than 2% of coral remaining. Rising levels of greenhouse gases have made the water too acidic for calcium-based organisms to grow. Dumping of dredged sediment to help create the world's largest coal port has caused further damage. Most of the colourful fish for which the reef is famous have also disappeared. On land, 50% of the continent's 400 butterfly species have died out, as well as numerous reptiles including Boyd's forest dragon, a rare and colourful lizard. In Europe, an astonishing 50% of amphibians have disappeared due to pollution, disease and loss of habitat, including many previously common species of frogs, toads, salamanders, newts, and caecilians. More than 20% of bird species have been lost, and around 15% of plants. In South Africa's Kruger National Park, a major conservation area, nearly 60% of the species under its protection have been lost. In the same region, 35% of Proteaceae flowering plants have disappeared, including the country's national flower, the King Protea. In South America, nearly half of the Amazon rainforest has been destroyed, with more than 2,000 native tree species becoming extinct. In Mexico, nearly 30% of animal species are either extinct or critically endangered. In Southeast Asia, the Indian elephant is on the brink of extinction. 
Once a common sight in this part of the world, its numbers have plummeted due to poaching for the ivory of its tusks, loss of habitat, and human conflict. In the Arctic, nearly 70% of polar bears have disappeared due to the shrinking of summer ice caused by global warming. By 2080 they will disappear from Greenland entirely, and from the northern Canadian coast, leaving only dwindling numbers in the interior Arctic Archipelago. Many other well-known species of fish, bird and mammal become critically endangered around this time. This period is often referred to as the Holocene extinction event. As a direct result of human influences, the rate of species extinctions this century is between 100 and 1,000 times the natural "background" rate averaged over the evolutionary time scale of Earth.
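To give a sense of scale for the 100–1,000× multiplier quoted above, the sketch below plugs in a commonly cited background rate of roughly one extinction per million species-years and a rough species count; both input figures are assumptions for illustration only, not taken from the text.

```python
# Rough scale of the Holocene extinction rates described above.
# The background rate and species count are illustrative assumptions.

BACKGROUND_RATE = 1e-6      # extinctions per species per year (assumed)
N_SPECIES = 8_000_000       # rough global species estimate (assumed)

for multiplier in (100, 1000):
    per_year = BACKGROUND_RATE * N_SPECIES * multiplier
    print(f"{multiplier}x background -> roughly {per_year:.0f} extinctions per year")
```

Under these assumed inputs, the quoted multipliers correspond to somewhere in the hundreds to thousands of extinctions per year, against a background of only a handful.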

2046: Japan's population falls below 100 million. For most of the 20th century, Japan experienced high population growth. The post-war "miracle" saw its booming economy catapulted into second place behind only the USA by the 1980s. This period of rapid expansion ended with the bursting of the asset price bubble in 1991 and was followed by the "Lost Decade", which persisted into the 21st century. Japan experienced stagnation both financially and demographically. By 2010, the country's population had peaked at 128 million. From this point onward it would undergo a long, slow decline. This net loss was caused by falling birth rates and almost no net immigration, despite the highest life expectancy in the world at 84.6 years of age. The rising costs of childbirth and child-raising, later average age of marriage, the increasing number of unmarried people and greater numbers of women in the workforce were all contributing factors. Japan's "total fertility rate" (defined as the average number of children born to a woman over her lifetime) was now 1.43, far below the 2.07 replacement-level fertility needed to sustain a country in the long term. In some isolated rural areas, this trend was so damaging that entire communities were already disappearing. The shrinking segment of productive workers aged 15 to 64 began to seriously impact on Japan's GDP, affecting pensions and social welfare in particular. As the overall standard of living fell, calls were made for radical policies, but the low birth rate could not be fundamentally reversed. By the 2040s the population was falling by over a million each year. By 2046, it has dropped below 100 million, a level last seen in the 1960s. This trend will continue for the rest of the century, with Japan struggling to achieve a "managed decline" as it deals with the inevitability of its situation and its waning influence on the world stage. Among the solutions being introduced is greater use of robots to keep society running. 
Japan's love for and experience of robotics (perhaps greater than any other country's) and its already well-established industry are helpful in this regard, mitigating some of the impacts it would otherwise have felt.

The UK state pension age has risen to 70. Due to increases in life expectancy, and unsustainable levels of age-related spending, the UK's state pension age has risen from 65 in 2012, to 70 now. This provides a net benefit to the public finances of around 0.7% of GDP.

Angola is declared free of landmines. For most of the late 20th century, conflict ravaged the southern African nation of Angola. The War of Independence (1961–1974) resulted in tens of thousands of civilian and military deaths, before the granting of independence from Portugal in 1975. However, the Angolan Civil War immediately followed and would last from 1975 until 2002, with estimates of 800,000 killed and four million displaced. During this time, nearly 70,000 people became amputees as a result of landmines placed all over the country. The mines continued to litter rural areas long after the end of the war, contributing to ongoing civilian casualties. Angola ranked among the most heavily mined countries in Africa and the world, with as many as 15 million devices having been placed by militant forces. Dozens of people continued to be killed or seriously injured each year, while vast areas of countryside remained off-limits. Diana, Princess of Wales, had raised awareness of the landmines during a visit to Angola shortly before her death in 1997. She was influential in the signing of the Ottawa Treaty, which created an international ban on the use of anti-personnel landmines that same year. Demining of Angola continued in the early 21st century. However, the pace of removal proved to be slow, due to lack of government funding, despite contributions from Britain and other foreign nations. 
Even as late as 2020, more than 1,200 minefields remained active, covering a total area of more than 100 million square metres (100 km²). Based on the rate of demining, it was estimated to take until the mid-2040s for complete clearance. By 2046, this landmark is finally reached, with Angola declared free of landmines. Safe access to land and resources – such as fields, forests, water sources, and metal/mineral deposits – has now greatly improved the country's social and economic prospects, in a way similar to Mozambique, which gained as much as 25% in GDP per capita as a result of mine clearance.

Carbon pricing is ubiquitous worldwide. By the mid-2040s, carbon pricing initiatives have been implemented by practically every government around the world, via policies such as carbon taxes and/or a requirement to buy permits to emit, generally known as cap-and-trade. This has finally solved the economic problem of CO2 and other known greenhouse gases being negative externalities – detrimental products lacking any market mechanism responsive to the costs of their emission. In the 1990s, only a handful of countries had these measures in place, all of them in Europe. The number grew steadily in the early 21st century amid mounting concern over environmental impacts. International agreements, such as the Paris climate accord of 2016, gave added impetus. There were, of course, a number of setbacks along the way. In Australia, for example, the Labor Party government led by Julia Gillard implemented a carbon tax from 2012 to 2014 (priced at $23–24/tonne). This was later revoked by the Liberal Party, led by Tony Abbott, who insisted that coal was "good for humanity" and would be the "world's main energy source for decades to come." In the USA, meanwhile, President Trump announced his intention to withdraw from the Paris Agreement and introduced a slew of policies aimed at rolling back regulations. Momentum continued to build, however, as the impacts of climate change became more and more obvious. 
Alongside this, a demographic shift was underway: the influence of the Baby Boomer generation, who had supported many traditional conservative policies, was waning in favour of Generation X and the Millennials, who preferred stronger action on the environment. Not only were national governments taking steps, but smaller regions and individual cities were introducing policies too. As the years rolled by, public opinion showed increasing support for carbon taxes and emissions trading. Studies were finding that a price on carbon – combined with tax cuts for businesses, and rebates to low-income families most affected by the scheme(s) – could be an effective and bipartisan way to curb emissions of greenhouse gases. By the early 2020s, China had introduced a national cap-and-trade scheme for its power sector, boosting the share of global emissions with a pricing system in place from 15% to around 20%. As progress in carbon (and other greenhouse gas) pricing continued, a substantial economic shock lay on the horizon. The installed capacity of renewables like solar and wind power had increased exponentially, thanks to dramatic improvements in cost and efficiency, alongside major advances in batteries, smart grids, and other low-carbon technology. When combined with a rapid uptake of electric vehicles (and the outright phasing out of traditional internal combustion engines by many countries), fossil fuel companies now had trillions of dollars' worth of stranded assets. A sudden drop in the value of fossil fuels – and the bursting of this "carbon bubble" – became apparent in the early 2030s, by which time around 50% of worldwide greenhouse gas emissions had a pricing mechanism in place. While many investors had recognised the scale, speed and inevitability of this energy transition and taken steps to limit their exposure, a significant amount of economic disruption nevertheless occurred. The losses, centred on the U.S. 
and Canada, were greater than those seen during the financial crisis of 2008. By the 2040s, the world has largely recovered from this earlier shock – although new economic risks and uncertainties abound, with technology and society now changing faster than ever before, alongside a frantic race to ameliorate the effects of climate change.
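The revenue-neutral design described above – a carbon fee whose proceeds fund business tax cuts and household rebates – can be sketched as a toy calculation. The price echoes Australia's 2012 tax of $23–24/tonne mentioned earlier, but the emission total, household count, function names and the 30% business-cut share are all hypothetical, chosen only to illustrate the mechanism.

```python
# Toy model of a revenue-neutral carbon fee with per-household rebates.
# All figures are hypothetical, chosen only to illustrate the mechanism.

CARBON_PRICE = 24.0  # $ per tonne CO2 (cf. Australia's 2012 tax of $23-24/t)

def carbon_revenue(emissions_tonnes: float) -> float:
    """Total fee collected from priced emissions."""
    return emissions_tonnes * CARBON_PRICE

def distribute(revenue: float, households: int, business_cut_share: float = 0.3):
    """Split revenue: a share funds business tax cuts, the remainder is
    rebated equally per household (which tends to favour low-income
    families, whose carbon footprints are below average)."""
    business_cuts = revenue * business_cut_share
    rebate_per_household = revenue * (1 - business_cut_share) / households
    return business_cuts, rebate_per_household

revenue = carbon_revenue(500_000_000)  # assume 500 Mt of priced emissions
cuts, rebate = distribute(revenue, households=10_000_000)
print(f"revenue ${revenue/1e9:.1f}bn, business cuts ${cuts/1e9:.1f}bn, "
      f"rebate ${rebate:.0f} per household")
```

The point of the design is that the fee changes relative prices (emitting becomes costlier) while the rebates return the money, so average purchasing power is preserved rather than taxed away.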

2047: Fully autonomous, intelligent military aircraft. Most of today's jet fighters are now entirely computer controlled. These unmanned planes have fully autonomous capability from the moment they take off, to the moment they land. A combination of strong AI, swarming behaviour and hypersonic technology is employed to create near-instantaneous effects throughout the battlespace. The growth of network-centric warfare and the increasing complexity of enemy types, movements and environmental factors have led to major advances in target recognition technology and collision avoidance systems. This allows whole squadrons of pilotless planes to synchronise, attack from multiple vectors and regroup in seconds. Further autonomy is provided by automated air-to-air refuelling, self-repair, and other systems. With the emergence of AI, personnel costs have shifted from operations, maintenance, and training to design and development. Machines can perform repairs in-flight (including the use of "self-healing" nanotechnology composites), while routine ground maintenance requires little or no human labour, being done mostly by robots. New tactics and information can simply be programmed into the aircraft, or they can "learn" from others in the swarm. With their hypersonic engines, inhuman reaction times and improved weaponry, these craft would run rings around the human pilots of earlier decades. Following the retirement of the F-35 Lightning II, manned fighter planes are now essentially obsolete.

Unmanned probe to 2060 Chiron. 2060 Chiron is a minor planet in the outer Solar System. Discovered in 1977, it became the first in a new class of objects known as centaurs – small bodies orbiting the Sun between Jupiter and Neptune – named after the half-man, half-horse beings of Greek mythology, in recognition of their dual comet/asteroid nature. 
2060 Chiron has a diameter of around 145 miles (233 km) and an eccentric, elliptical orbit going from just inside the orbit of Saturn (8.5 AU) to just outside the orbit of Uranus (19 AU). With an orbital period of 51 years, it reaches perihelion in 2047. As part of the recent exploration of the Kuiper Belt, an unmanned robotic probe has been sent to intercept it, arriving this year. The mission returns vital data about Chiron's size, shape, polar obliquity, atmosphere, surface morphology, composition, internal structure, surface activity (including the nature of Chiron's outbursts), and its origin. NASA had been planning for such a mission as far back as 1994.
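The 51-year period quoted above is consistent with the perihelion and aphelion distances via Kepler's third law, which for a body orbiting the Sun gives T (in years) = a^1.5 (with a in AU). A quick sketch of the check, using only the distances given in the text:

```python
# Kepler's third law check on 2060 Chiron's orbit, using the
# perihelion (8.5 AU) and aphelion (19 AU) distances quoted above.
# For a heliocentric orbit: T[years]^2 = a[AU]^3.

perihelion_au = 8.5
aphelion_au = 19.0

# Semi-major axis is the mean of the perihelion and aphelion distances.
a = (perihelion_au + aphelion_au) / 2                      # 13.75 AU

# Orbital period from Kepler's third law.
period_years = a ** 1.5                                    # ~51 years

# Orbital eccentricity: e = (Q - q) / (Q + q).
eccentricity = (aphelion_au - perihelion_au) / (aphelion_au + perihelion_au)

print(f"a = {a:.2f} AU, T = {period_years:.1f} yr, e = {eccentricity:.2f}")
```

The computed period of roughly 51 years matches the figure in the text, and the eccentricity of about 0.38 confirms the orbit is strongly elliptical rather than near-circular.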

2048: The Antarctic Treaty comes up for review. Antarctica is the last remaining unspoilt wilderness, untouched by the massive industrialisation common everywhere else on the planet. It spans an area of 13.7 million sq km (5.3 million sq miles) and is covered by an ice sheet 4 km (2.5 miles) deep. It has no human inhabitants, other than a small number of scientists in research stations. The vast, icy continent is governed by the terms of the Antarctic Treaty, which came into effect in 1961. This was signed by Argentina, Australia, Chile, France, New Zealand, Norway, the UK, Belgium, Japan, South Africa, the USA, and Russia. The first seven of these countries have historic claims to the continent (none of which are generally recognised) and the Treaty preserves the status quo, neither recognising nor repudiating the old claims, but forbidding their expansion in any way. The terms of the Treaty also forbid the assertion of new claims. The discovery of a hole in the ozone layer, and other concerns, led to the addition of a new environmental protocol, agreed in 1991. This entered into force in 1998. It was intended to protect Antarctica's environment and ecosystems and included a total ban on the exploitation of mineral and energy resources, as well as strict regulation of pollution and other damaging activities. The protocol is open for review in 2048, exactly 50 years after it was implemented. Much has changed in the last half century. Earth's population is over 50% larger, placing a substantial strain on the planet's resources that has become alarmingly obvious by now. Metal and mineral supplies continue to be an issue, even with large-scale recycling systems in place. Despite objections from environmentalists, there is general consensus among the international community that some limited exploitation of Antarctica should be permitted within certain specially controlled areas. 
Over the next few years, a new treaty is drawn up with modified clauses, though disputes continue over territorial boundaries. There are significant logistical challenges to mining and mineral extraction in the region – such as the isolation, extreme cold, rough seas, and thick ice sheet. However, new technologies look set to mitigate these problems, including the use of robots, heavy automation, and alternative methods of drilling. In addition, climate change and the melting of ice is making it possible to exploit some previously inaccessible areas of the western ice sheet.

King crabs are infesting Antarctic marine ecosystems. In addition to humanity's increased presence resulting from the Antarctic Treaty review described above, certain animal species have also begun to move into and exploit the continent due to changing environmental conditions. King crabs are one such species. For tens of millions of years, icy cold waters had excluded shell-crushing fish and crustaceans from the continental shelf surrounding Antarctica. Rapid warming has now allowed predatory crustaceans to return. From the 1950s to the 2010s, the west coast of the Antarctic Peninsula witnessed a 3.2°C temperature rise – several times the global average. This allowed king crabs to move up the outer shelf, into shallower waters just 200 metres in depth. By the middle of the 21st century, they are infesting many coastal areas along the Western Antarctic Peninsula. This is causing severe disruption to the food chain, with catastrophic consequences for unique seafloor communities in the region.

Bionic eyes are perfected. By the late 2040s, artificial retinas and other visual prostheses can match the visual quality of a real human eye. The first generation of these devices had emerged in the early 2000s, with bulky external equipment and only a very small number of pixels. 
Subsequent versions featured much higher resolutions, providing ever-more capabilities to improve the living standards of patients with partial or total blindness. Further miniaturisation of components in the 2020s and 30s, alongside new materials and battery/power improvements, resulted in smoother and more colourful visual quality – as opposed to the monotone and blocky view of the world offered by the previous generation of devices. This progress continued into the 2040s, culminating in electrode arrays with hundreds of megapixels and wireless operation. Myopia and other eye conditions were a growing problem in the early 21st century. Half of the world's population faced living with short sight, with up to one-fifth of these people at a significantly increased risk of blindness. Gene therapy proved to be vital in addressing this global issue, but visual prostheses became an increasingly important addition to the range of treatments available. By 2048, the implants not only provide a cure, but also a means of further enhancing the day-to-day experience of users, with such features as embedded cameras, Wi-Fi, augmented reality, infrared, zoom capability and so on. This is now tempting even healthy adults to upgrade their sight in one or both eyes by taking advantage of the improvements offered by computerised vision. These types of devices had already been available some years earlier to the military and certain other specialised or high-profile citizens, but are now inexpensive enough to have reached the mainstream. There are issues related to the technology – chiefly in regard to privacy and security – though the same pervasive recording capability is helping to expose a number of government and corporate scandals.

2049: Completion of China's Belt and Road Initiative. Chinese President Xi Jinping first proposed the Belt and Road Initiative (BRI) in 2013. Its purpose was to strengthen trade, infrastructure, and investment links between China and more than 100 other countries spanning Eurasia, Africa, and Oceania. One of the largest efforts of its kind in history, the BRI would comprise thousands of projects aiming to improve the lives of 4.4 billion people – about 60% of the global population – in countries accounting for 40% of the world's gross domestic product (GDP). The Chinese government called the initiative "a bid to enhance regional connectivity and embrace a brighter future". However, some observers viewed it as a push for Chinese dominance in global affairs, leading to a China-centred trading network. Its targeted completion date of 2049 would coincide with the 100th anniversary of the People's Republic of China. The BRI would have two main components – the Silk Road Economic Belt (SRB) and the Maritime Silk Road (MSR). The former was planned as a series of interconnecting infrastructure corridors passing through countries on the original Silk Road (established in the Han Dynasty) across Central Asia, West Asia, the Middle East, and Europe. The Maritime Silk Road, by contrast, would consist of sea-based routes over several contiguous bodies of water: the South China Sea, South Pacific Ocean, and wider Indian Ocean area. Branching outwards from the main backbone of the SRB would be six subdivisions, forming links between major regions and coastlines. Like the tributaries of a major river, these consisted of a New Eurasian Land Bridge; a China–Mongolia–Russia Corridor; a China–Central Asia–West Asia Corridor; a China–Indochina Peninsula Corridor; a Bangladesh–China–India–Myanmar (BCIM) Corridor; and a China–Pakistan Corridor. 
In addition to the SRB and MSR, Russia and China also worked jointly to build an "Ice Silk Road" along the Northern Sea Route in the Arctic, to facilitate increased resource extraction and cargo deliveries, as well as tourism and scientific opportunities. China was already a world leader in infrastructure investment during the late 20th and early 21st century. In contrast with a general underinvestment in transport across much of the industrialised world, China had pursued major infrastructure-based development, resulting in engineering and construction expertise and a wide range of projects. The Belt and Road Initiative was therefore a natural extension and continuation of this long-term strategy. Five major priorities were defined for the BRI as a whole: policy coordination, infrastructure connectivity, unimpeded trade, financial integration, and connecting people. Over a period of 35 years, China would make several trillion dollars' worth of investments – in ports, airports, roads, railways (including high-speed routes), bridges and tunnels, as well as power plants and telecommunications networks. One of the more notable examples included a clean energy "super grid", consisting of ultra-high voltage electricity networks linking China and much of the Asian continent. However, while improving the flow of goods, services, people, and capital in both developed and developing regions, the BRI programme was also criticised for being a debt burden on poorer countries; a form of neo-colonialism, which could leave them vulnerable to China's influence. Furthermore, a significant number of the infrastructure projects had proved to be "white elephants", lying empty and unused for years. The resulting bailouts placed the International Monetary Fund (IMF) under increasing pressure, torn between the competing interests of Washington and Beijing.
Despite these issues, the Belt and Road Initiative proved to be a success, overall – adding significantly to worldwide GDP and further cementing China's place as the leading global power of the mid-21st century. The Dead Sea is drying up. The Dead Sea is a unique geological feature. Located between Israel and Jordan, it is the lowest point on Earth. With an extremely high content of mineral salts (20%), over six times greater than any ocean, it is completely devoid of life, except for extremophile bacteria. The salts are so concentrated that swimmers can float like corks, without using a life vest. The water of the sea is also purported to relieve pain and treat several different skin conditions and arthritis. For these reasons it has been a world-famous tourist attraction. By 2049, however, the sea has almost vanished. Its main supply of water – the River Jordan to the north – has seen extensive diversions for industry, agriculture, and domestic use. This has reduced its flow to just a trickle by the time it reaches the Dead Sea, far from adequate to replace water lost by evaporation. For decades, the Dead Sea had plummeted in depth, compounded by rising global temperatures accelerating the evaporation of water, and the growing population in the region. By 2049, little more than a pond remains. This is despite efforts to divert water from the Mediterranean and nearby Red Sea. The effects of heat stress on labour capacity have doubled. In the early years of the 21st century, peak summer months of heat stress were cutting human labour capacity to around 90 percent of its full potential. By the middle of the century, this figure has dropped to 80 percent. Rising global temperatures are now having a major impact on those who work outside, or in hot environments – particularly in mid-latitude and tropical regions like South and East Asia, North America, and Australia. This trend has occurred despite heavy reductions in man-made CO2 emissions.
Although robots are now handling many human roles, it nevertheless remains a serious issue for the economy and society in general. Heat stroke, heat exhaustion, heat cramps and other conditions are increasing the risk of accidents and injuries. Many sports and leisure activities are being abandoned due to excess heat and humidity, with people forced to spend more and more time indoors. This has boosted the appeal of virtual reality to replace the physical world. The Boeing 747 is retired from service. In the 1960s, US airline Pan Am sought to introduce a new and much larger aircraft than any previously operated. Foreseeing the enormous, long-term growth potential of air travel, the company wanted a jet 2½ times the size of the 707 series, to lower seat costs by 30% and to democratise aviation. Aerospace company Boeing started work on the 747, the first twin aisle airliner, with Pan Am ordering 25 planes. A first flight occurred in February 1969, and it entered service with Pan Am in January 1970; it was the first plane to be dubbed a "Jumbo Jet". Development of the 747 continued over the decades, leading to new variants with even larger sizes, more powerful engines, longer flight ranges and greater seating capacities. The final 747–8 featured a wingspan of 68.4 m (224 ft), length of 76.25 m (250 ft), a range of 14,320 km (7,730 nautical miles), and seats for 467 passengers in a typical three-class configuration. With its huge dimensions, this giant of the skies became an icon of the late 20th century. In addition to being a civilian aircraft, the 747 served as the basis for government and military variants, and experimental testbeds like the Shuttle Carrier. Annual orders peaked in 1990 with 122 new planes. The 747 remained the world's largest passenger aircraft for 37 years, until the Airbus A380 superjumbo, which had its maiden commercial flight with Singapore Airlines in 2007.
However, major changes began to occur in the industry, which became increasingly apparent during the 21st century. Airline operators gradually shifted their preference away from very large aircraft (VLA) and towards smaller, more fuel-efficient planes with more flexibility and lower upfront costs. Delta Air Lines retired the last U.S. passenger Boeing 747 in December 2017, after the plane had flown for every major carrier since 1970. While more than 450 remained in service globally by the end of the decade – with Atlas Air and British Airways being the largest operators – a growing proportion of 747 orders had gone to freighter rather than passenger variants. In July 2020, media reports revealed that Boeing intended to end production of the 747 series, after delivery of its 16 outstanding orders. Production of the final aircraft would occur in 2022. While no longer a mainstay of passenger flights, it remained popular for delivering cargo, thanks to its huge capacity. The 747 series had been able to endure about 35,000 pressurisation cycles and flights – up to 165,000 flight hours – before metal fatigue set in. Given a service lifetime of around 27 years, this would mean service ending in the late 2040s. It joins the B-52 Stratofortress, another massive Boeing aircraft, also retiring this decade. New and extremely hi-tech aircraft are now replacing the older generations, with major advances in travel comfort, further improvements in efficiency and a great reduction in journey times. The Fukushima disaster is cleaned up. The Fukushima Daiichi nuclear disaster was a catastrophic failure at a Japanese nuclear power station on 11th March 2011, resulting in a meltdown of three of the plant's six nuclear reactors. It occurred when the plant was struck by a tsunami triggered by the Tōhoku earthquake.
Substantial amounts of radioactive materials began to leak, creating the largest nuclear incident since the Chernobyl disaster of April 1986 and only the second ever to measure Level 7 on the International Nuclear Event Scale. Although no short-term radiation exposure fatalities were reported, some 300,000 people were forced to evacuate the area, nearly 16,000 died in the earthquake and tsunami, and 1,600 deaths resulted from evacuation conditions, such as living in temporary housing and hospital closures. The disaster prompted some countries to re-evaluate their policies on nuclear energy. Germany and Switzerland, for example, abandoned this technology altogether, shutting down their last remaining plants in 2022 and 2034, respectively. In the wake of the plant meltdowns, a huge clean-up and decommissioning process was initiated. This presented enormous challenges, with massive amounts of radioactive water spilling out – some finding its way into the Pacific (along with debris) and even reaching as far as the U.S. West Coast. Methods used to contain this water included a frozen underground barrier, with coolant fed into pipes at -30°C (-22°F). Robots were required in some parts of the facility, as radiation levels were often too high for humans. In 2014, it was estimated that sealing every reactor off would cost tens of billions of dollars, taking 30-40 years. Sure enough, by the end of the 2040s, this operation is finally reaching its conclusion.

2050: Humanity is at a crossroads. The world of 2050 is a world of contrasts and paradoxes. On the one hand, science and technology have continued to advance in response to emerging crises, challenges, and opportunities. This has created radical transformations in genetics, nanotechnology, biotechnology, and related fields. On the other hand, many of these same technologies have been so disruptive that they have led to a more frightening, unpredictable, and chaotic world than ever before. Humanity is now at a crossroads that will determine its future path for centuries to come – survival or destruction, prosperity or collapse. Some of the most cherished political, economic, and social structures have been turned on their heads. In a sense, capitalism remains the dominant economic model, but is now evolving drastically in response to ecological impacts, resource scarcity, demographic trends, technology, and a host of other factors. The endless consumer culture that was prevalent throughout the first world has all but collapsed, replaced by a societal need to conserve. Though there are still many wealthy people around, money is concentrated in a shrinking upper class. By 2050, "traditional" free market capitalism is largely viewed as a broken system. As more and more wealth trickles upwards to the hyper-rich elite, there is a growing consensus that money itself – the profit motive – is a major obstacle to future progress, and a new driving force may be required for civilisation to flourish. Debates are raging on what reforms to make in order to adapt societies to this rapidly changing world. People everywhere sense that a great transition is approaching, the likes of which have never been seen before in all of human history. It is clear that some new global paradigm will appear; but it is still unclear what this will be. Decades of stagflation have produced a fragmented, chaotic, and perpetually sluggish global economy.
Nearly half of the world's nations have "junk" credit ratings, effectively making them bankrupt. US national debt has now reached almost 400% of GDP, far exceeding even the levels seen during World War II. China and India, though surpassing the US in overall GDP, have also stagnated. In the face of economic catastrophe, international politics has faced enormous challenges. Although the number of democratic countries has risen significantly over the years, many have turned inward, cutting off foreign relations. Revolutions, wars, and failed states have produced a strikingly different geopolitical map than seen at the beginning of the century. To repair and maintain the fabric of society, an increasing number of regions have abandoned their national currencies in favour of interest-free, non-fiat, non-inflationary local ones. Decentralised cash systems such as Bitcoin and other electronic alternatives have also exploded in use. Social systems are under extraordinary stress today. The younger generations are increasingly resentful towards the elderly – seeing them as the cause of many problems, and a drain on capital as the ratio of workers to seniors continues to fall. The rich and poor have continued to grow apart, now that upward social mobility has become next to impossible. Massive protests outside corporate HQs and gated communities are a daily reality on the news. Global warming has created almost 150 million climate refugees: a sixfold increase compared to 2010. The influx of people to foreign lands has put a further strain on economies. Resentment towards migrants has produced an upsurge in nationalism with many isolationist parties sweeping government elections. To maintain order and stability, martial law and military occupation are a feature of many cities around the globe. Radical new political parties and movements have emerged, advocating the overthrow of the reigning system.
Recycling and waste management – for decades neglected by many countries – are among the issues now taking centre stage. New regulations and market pressures have forced corporations to move away from the model of planned obsolescence and mass production, to one of conservation and responsibility. Most firms no longer sell entirely new models of their products when technological advances are made. Instead, replacement components and upgrades form the bulk of profits, with items made of universally interchangeable parts. In a world of increasing resource conflicts, "doing more with less" has become an essential mantra. A system is also employed whereby customers return products at the end of their life cycle, to be used as materials for the next generation. In some of the worst-hit countries, mandatory resource dumps are organised, in which citizens are obligated to recycle any unnecessary possessions. Naturally, such systems are highly controversial and intrusive. Meanwhile, the widespread use of robots, automation, 3D printing and other technology has rendered obsolete many traditional human roles. Though industries have made vast improvements in speed and efficiency, it has come at the expense of a declining labour force. Consequently, overall government revenues have seen a net reduction. Radical Islam and its resentment of the West continue to produce new Jihadists. In addition, underground groups ranging from those angry at the first world's neglect, to anarcho-primitivists, have sprung up. By 2050, at least one terrorist nuclear attack on a major world city has been conducted by one of these groups. Large amounts of nuclear material had been missing from Russia since the 1990s and some inevitably fell into the wrong hands. Orders of magnitude greater in scale than 9/11, the attack leaves a deep psychological scar on many people alive today, fuelling much paranoia and suspicion between nations.
Despite this turmoil, progress has been achieved in cooperating on certain key issues, such as global warming. Carbon emissions have fallen substantially compared with 1990 levels, thanks to a global carbon tax and the widespread deployment of solar, wind and wave power, together with 4th generation nuclear. Fossil fuel reserves were declining in any case. Fusion power is also becoming available now and is being adopted by some of the leading nations. Orbital solar is another emerging industry. Energy efficiency and conservation have provided further reductions in CO2 output. However, carbon emissions from earlier decades remain locked into the system. This delayed reaction will continue to affect weather patterns and climate stability, as will the ongoing destruction of the Earth's rainforests, some of which are transitioning from carbon sinks to carbon sources. Sea levels have risen over a foot by now and are beginning to affect much of the world's coastal real estate. Large-scale carbon capture and sequestration appears to be humanity's only remaining hope of reversing these trends. Nearly half of the Amazon rainforest has been deforested. Lack of enforcement in the so-called protected areas has resulted in the Amazon undergoing a catastrophic decline. Though army troops were sent into regions of illegal deforestation, their numbers were simply too small, and the Amazon too vast, to have sufficient impact. Political corruption also played a role in undermining protection efforts. Droughts caused by global warming have further contributed to the decline, with many areas of jungle being turned into parched scrubland. By 2050, nearly 2.7 million sq km have been deforested. As a result, over 30 billion tons of carbon have been added to the atmosphere. Although clean energy sources are offsetting this, they can't save the countless species of plant and animal life dependent on the rainforest for survival. Substantial amounts of biodiversity have been lost.
Desperate efforts are being made by non-profit organisations to obtain DNA samples, in the hope of resurrecting these species at some point in the future. Wildfires have tripled in some regions. Rising global temperatures are creating drier conditions for vegetation – producing larger and more frequent wildfires. In North America, the geographic area typically burned has increased by an average of 50%. Worst hit are the forests of the Pacific Northwest and the Rocky Mountains, which have seen a tripling of areas affected. With so much extra burning, air quality and visibility in the western United States are being significantly degraded. There has been a 40% rise in organic carbon aerosols and other smoke particles. These irritate the lungs but are especially dangerous to people who have trouble breathing as a result of asthma and other chronic conditions. Southern Europe is also badly affected – especially Greece, which has been ravaged in recent decades. These wildfires are triggering positive feedback loops. As more and more carbon is liberated from burning material and released into the atmosphere, this is further accelerating the pace of global warming. Traditional wine industries have been severely altered by climate change. By 2050, many of the world's most famous wine-producing areas have been rendered unsuitable for traditional grape growing and winemaking, with climate change having severely impacted land use, agricultural production, and species ranges. The area suitable for wine production has declined by almost 85 per cent in some regions. California, Mexico, the eastern USA, Southern Europe, South Africa, and Australia are particularly affected. In response to the crisis, many traditional vineyards have shifted to higher elevations with cooler conditions – putting pressure on upland ecosystems, as water and vegetation are converted for human use. Others have made use of genetic engineering, or indoor growing methods such as vertical farming.
Geoengineering efforts are also getting underway but have yet to be successful on a global basis. Although many regions have been devastated, others have actually benefited. This is particularly noticeable in the Rocky Mountains near the Canadian-US border, the westernmost parts of Russia, and Europe, which has seen a massive northward shift in the areas suitable for wine production. Fish body size has declined by nearly a quarter. By far the greatest impact from global warming has been in the seas and oceans, where changes in heat content, oxygen levels and other biogeochemical properties have devastated marine ecosystems. Globally, the average body size of fish has declined by up to 24 per cent compared with 2000. About half of this shrinkage has come from changes in distribution and abundance, the remainder from changes in physiology. The tropics have been the worst affected regions. Hi-tech, intelligent buildings are revolutionising the urban landscape. In the first half of the 21st century, a soaring urban population posed serious problems for the environment, health, and infrastructure of many cities. In newly industrialised nations especially, urban centres became polluted, overcrowded and chronically inefficient. Throughout the world, metropolitan areas grew to unprecedented sizes – putting huge and increasing pressure on city planners to adapt. Amid worsening climate change and resource depletion, urban regions were forced to either evolve, or die off. Countless cities failed to make this transition in time, and went the way of Detroit, many being abandoned and left to decay, or subject to intense military control and martial law. In those that survived, a new generation of buildings and infrastructure emerged based on these rapidly changing social and environmental needs. Among the most important trends in modern architecture has been self-sufficiency.
By 2050, environmental and resource degradation have become so obvious and extensive that they have triggered a radical rethink of production and consumption among citizens. As such, many modern skyscrapers now come complete with the internalised creation of food, water, and other resources. Farms often comprise multiple floors of a tower – regardless of its purpose – while rain, mist and condensation are constantly trapped and stored. Advanced 3D printers are available locally on site to manufacture everything from household furniture to personal transportation, to replacement parts for the building itself. Energy is typically provided by photovoltaics and wind turbines. These are often integrated seamlessly into the building design, so as not to harm the aesthetic appeal. Solar power, for instance, can be collected by windowpanes or special photovoltaic paints applied to outside surfaces. The efficiencies for solar have been improving steadily for decades. Nature features heavily in these structures. Many towers incorporate parks and sky gardens, helping to increase the overall biodiversity of a city, with numerous bird and small animal species finding homes and nesting places. Careful environmental controls ensure that these creatures are protected while not becoming a nuisance for human residents. The outside of buildings is often covered with vegetation, or special membranes, designed to filter pollutants and capture CO2. Government regulations now require a large percentage of buildings to be fitted in this way, making it a dominant style of architecture today. The artificial parts of this outer layer can also adjust to wind conditions, temperatures, moisture levels and sunlight in order to produce optimal thermal comfort for the human and animal occupants. Algae bio-fuel cells adorning the facade can also absorb CO2 while acting as an additional source of electricity. Buildings are integrated into the city around them in a number of ways.
Fuel restrictions and other factors have led to increasingly socialised transportation. The bottom floors of most towers have dedicated public car share (AI controlled) and bike share facilities, while bus and other mass transit stations are often built into the structures themselves. Pedestrian sky-walkways feature heavily in most modern cities, improving access and permeability of the urban realm, while shielding walkers from the elements. If ornamented with foliage, they can also function as elevated parks and gardens. Buildings are making cities more comfortable and inviting in various other ways. By tightly controlling a tower's reflectivity, heat absorption and heat balance, for example, planners can significantly reduce the temperatures associated with urban heat island effects. This comes at a time when temperatures in less developed cities are soaring from the combined effects of climate change and urbanisation. The average modern building in 2050 is seamlessly integrated into a city's power supply, acting as another node in a city-wide smart grid. Nearly all buildings are able to transmit locally produced energy back into the system. Wireless electricity transfer is also common, with energy beamed invisibly between buildings, which eliminates the need for unsightly poles and cables. AI systems within each building direct its total power consumption, adjusting according to the varying needs of occupants and considering even the most minor of details. Overall, this new smart infrastructure is helping to drastically improve the nature of urban living. Cities following this model are becoming far more liveable, clean, efficient, and modernised. Though many regions have collapsed into chaos, others are now leading the way in providing a more sustainable path for humanity. Smaller, safer, hi-tech automobiles. 
Increased living costs, lifestyle changes and environmental factors have resulted in smaller, more energy-efficient cars that are usually rented on demand rather than owned. More people than ever before are choosing to live and work alone, while the number of children per couple has also declined, which is reducing the demand for larger and more expensive vehicles. Practically all cars in the developed world are now self-driving and either hybrid or pure electric, while traffic flow and other road management issues are handled by advanced networks of AI. The resulting fall in congestion has boosted economies by billions of dollars. Air pollution has also declined greatly. The inherent safety of being controlled by machine intelligence, rather than human hands, allows for greater speed of travel. An increasing number of countries are removing the speed limits on highways. Even when crashes do occur, which is extremely rare, built-in safety features and tougher materials (e.g., carbon nanotubes) mean that fatalities are becoming virtually non-existent. Major advances in air travel comfort. Commercial airliners of 2050 are safer, quieter, and cleaner than those of earlier decades. The vast majority are based on some form of renewable energy. Additionally, travel times have greatly improved. Hypersonic engines have seen further development, aided by the rapid growth of artificial intelligence and the resulting advances in computer-automated design evolution. It is now possible to reach anywhere on the planet in under 2.5 hours. The interiors of most planes are breathtakingly luxurious compared to those of earlier decades. New materials have enabled the use of transparent walls and ceilings, flooding the fuselage with natural light. Seating areas are beautifully spacious and filled with a range of interactive technology. When flights are running below full capacity, any unneeded seats are automatically shuffled to the rear, collapsed, and hidden from view. 
The remaining seats are redistributed, rearranging themselves to offer everyone the maximum possible legroom. These seats can also morph to perfectly fit passengers' bodies. They can re-energise travellers with vitamin and antioxidant-enriched air, mood lighting, aromatherapy, and acupressure treatments. In the mid-section of the plane is a hi-tech zone offering a range of activities from virtual golf to conference facilities and bar/lounge settings. Continent-wide "super grids" provide much of the world's energy needs. The need for reliable, clean, cost-effective energy has led to the creation of electrical "super grids" across much of the world. These allow nations to share power from abundant green sources and distribute it to those regions most in need. By cooperating in this way, it is possible to greatly reduce waste and to optimise power supplies on a continent-wide scale, at all times of the year. For instance, winter gales in the North Sea can provide a surplus of wind power, which is complemented by the summer winds of Morocco and Egypt. Meanwhile, solar panels in northern Africa generate three times the electricity compared with the same panels in northern Europe, due to much greater intensity of sunlight. Up to 100 GW of power is being supplied from Africa to Europe in this way. Similar large-scale infrastructure is now in place throughout America, Asia, and other parts of the world. Long distance transmission technology has seen major advances over the decades. Each country is connected to the grid using high-voltage direct current (HVDC) transmission, instead of traditional alternating current (AC) lines. This results in far greater efficiency since DC lines have much lower electrical losses over long distances. China completes the largest water diversion project in history. The South-North Water Transfer Project – proposed almost a century ago – is finally completed in China this year at a cost of over $60 billion.
This becomes the largest project of its kind ever undertaken, stretching thousands of kilometres across the country. Its main purpose is to divert water from the southern region of China to the drier north. It is hoped that this will spur economic growth and stability in the more populous northern area, where the per capita share of regional water has declined to near-crisis levels. It consists of an extensive system of tunnels, dams, reservoirs, and canals, all connecting and diverting water from China's largest rivers – including the Yangtze, Yellow, and Hai rivers. At its peak capacity, the entire system can move nearly 45 billion cubic metres of water annually. First proposed by Mao Zedong in 1952, the project was officially approved in 2002. The first stage of construction, the 717 mile (1,155 km) long eastern route, was completed in 2013. This begins near the mouth of the Yangtze, crosses through the Yellow River and ends at the Beijing-Tianjin Metropolitan area within the Bohai Economic Rim. This brings much-needed water to one of the largest and most densely populated conurbations in the world. Along with the construction of new tunnels and pumping stations, the Grand Canal was upgraded in order to accommodate the increased flow of water. Adding to this is the central route, completed in 2014. This brings water from the Danjiangkou and Three Gorges reservoirs, as well as the Han River, north to Beijing and its neighbouring provinces. This totals 787 miles (1,267 km) in length and by 2030 was diverting over 13 billion cubic metres of water annually. The third and final stage to be completed is the 310 mile (500 km) western route. This involved working on the Qinghai-Tibet Plateau – from 3,000–5,000m above sea level – and posed major engineering and climatic challenges. This route diverts water from the headwaters of the Yangtze to the parched eastern plateaus. Like the Three Gorges Dam before it, the South-North Water Transfer Project receives heavy criticism.
In addition to environmental damage through mining, construction and pollution, there are worries about the increased potential of floods in certain areas and droughts in others. Also of concern are the hundreds of thousands of people displaced from their homes during construction. Meanwhile, other diversion projects in the south have provoked conflicts with neighbouring countries. Many doubted that China had enough water to begin with to make the project worthwhile. Indeed, by the 2050s, southern China itself is beginning to feel the effects of melting Himalayan glaciers and drying conditions. As a result, the water diversion project rarely operates at full capacity, primarily acting as a way to evenly distribute water around China, easing tensions between the inland and coastal regions. While of some benefit to China now, in the coming years, even projects of this magnitude will be insufficient to prevent serious water shortages. Longer term, only desalination will be able to save the country.
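The efficiency advantage of HVDC links described in the super grid discussion above can be illustrated with a rough resistive-loss estimate. All of the numbers below (power level, voltages, line length, resistance per kilometre) are illustrative assumptions rather than figures from the text, and the model deliberately ignores converter-station losses, treating the AC case with a simple equivalent resistance:

```python
# Rough model of transmission losses: for power P sent at voltage V over a
# line of resistance R, the current is I = P/V and the ohmic loss is I^2 * R.
# The loss fraction therefore scales as P*R/V^2, which is why the very high
# operating voltages of HVDC links cut long-distance losses so sharply.

def loss_fraction(power_w, voltage_v, resistance_ohm):
    """Fraction of transmitted power dissipated as heat in the line."""
    current_a = power_w / voltage_v
    return (current_a ** 2) * resistance_ohm / power_w

P = 3e9            # 3 GW transferred (assumed)
R = 0.01 * 2000    # ~0.01 ohm/km over a 2,000 km corridor (assumed)

# A bipolar +/-800 kV HVDC link gives 1.6 MV pole-to-pole; the hypothetical
# AC comparison runs at a lower effective voltage with ~10% extra resistance
# (a crude stand-in for skin effect in AC conductors).
hvdc_loss = loss_fraction(P, 1.6e6, R)
ac_loss = loss_fraction(P, 800e3, R * 1.1)

print(f"HVDC loss: {hvdc_loss:.1%}")   # about 2.3%
print(f"AC loss:   {ac_loss:.1%}")     # about 10.3%
```

Under these assumed figures the DC link loses only a few per cent of the transmitted power, in line with the "much lower electrical losses over long distances" claimed for the super grids.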

2051: The U.S. population reaches 400 million. By the early 2050s, the population of the United States has reached 400 million. This follows earlier milestones in 1915 (100 million), 1968 (200 million) and 2007 (300 million). In terms of world rankings, the country remains unchanged from the previous hundred years, in third place behind China and India. However, its share of the global population has decreased from 6.3% in 1950, to 4.6% in 2000 and 3.9% now. The U.S. in 2051 is more racially and ethnically diverse than ever, white people having declined in number and become a minority. Those of Hispanic origin have nearly doubled since 2015 and continue to be the fastest growing ethnicity. Black, Asian, and other minority groups have also increased, though at a much slower rate than Hispanics. The average U.S. citizen is also older, with a far larger percentage of people now aged 65 or above. Even the youngest Gen X-ers are now in their 70s, while the youngest Millennials have entered middle age. The number of centenarians (people aged 100 or over) has risen more than four-fold, from 85,000 in 2015 to 380,000 in 2051. Alongside a jobs crisis, resulting from automation and technological unemployment, this has put a huge burden on the working-age population – necessitating major reforms in taxation and social welfare. After a number of failed attempts, a form of universal basic income (UBI) has now finally been adopted by most states, though the U.S. was late in this process compared to most other nations in the developed world. America remains a majority Christian nation, but has seen a nearly 10% increase in those unaffiliated with any organised religion. The Muslim population has increased too but remains a small segment overall. In addition, new religions and cults have formed, often based on worship of AI, the 'Singularity' or certain technological niches. 
Recent advances in mainstream brain-computer interfaces, nanotechnology-based drugs and/or virtual reality have made it easier for leaders to manipulate vulnerable followers, a fact brought into sharper focus by a related government scandal around this time. As in the rest of the world, urbanisation has continued to increase, with more and more people living in cities. Across the country, enormous "megaregions" have now emerged, producing metropolitan areas that often overlap and stretch across several states. The biggest of these – Great Lakes – has grown by over 28%, from a population of around 55.5 million in 2010, to more than 71.2 million. The Northeast (encompassing Boston, New York, Philadelphia, Baltimore and Washington, D.C.) has expanded by over 35%, from 52.3 million in 2010 to 70.8 million now. California has seen even faster growth of between 50% (north) and 61% (south) – but remains split into these distinct halves. If combined, the two megaregions would now have a total population of 60.5 million people. Most of these megaregions are now linked by high-speed rail, which includes a number of 'hyperloop' routes, breaking down geographical barriers and enabling people to commute from longer distances than in earlier decades. However, America still lacks the kind of extensive networks enjoyed by the likes of Europe and East Asia. Despite many advances in technology, the U.S. remains a deeply divided country, riven with social and economic problems. The level of income inequality, already a serious concern at the start of the century, has widened even more, as the rate of return on capital continues to exceed the rate of economic growth over the long term. The first trillionaires have begun to appear in Forbes and other rich lists. This ever-increasing concentration of wealth in the hands of the upper 1% has further eroded the middle class. 
Alongside a perpetual cycle of left-right parties offering the same false promises, little has fundamentally changed in American politics. A general apathy has maintained low election turnouts, now averaging just 40% or so (compared with 50% at the start of the 21st century and 60% during the mid-20th). Recent decades have seen an intensification of the phenomenon known as "truth decay": an increasing disagreement about facts. This trend has been driven by ongoing polarisation (both political and demographic), as well as the Internet, social media, virtual reality, and other technologies, which have accelerated the spread of disinformation. Fake video scenes, for example – indistinguishable from real life, and even made interactive through strong AI – are now so cheap, accessible, and easily produced that they have eroded trust in formerly respected sources of factual information, blurring the line between opinion and fact. All of these trends are further harming civil discourse, causing disengagement and alienation, political paralysis, and uncertainty. Americans are routinely bombarded with information overload, often highly personalised and tailored to their individual circumstances in ways that would appear unsettling and surreal to observers from earlier decades. More and more people feel disillusioned by what they see as a hyper-commercialised, intrusive, manipulative society. With oil, gas and coal no longer playing a significant role in the U.S. economy, the lobbying power of fossil fuel companies has essentially disappeared. However, new lobbyists have taken their place – such as biotech and robotics companies, seeking to change regulations and monopolise their industries. The explosion of genetic upgrades, implants, beauty treatments and other personal tech has become yet another driver of inequality. The impact of these various trends has been felt most strongly in the southern 'Bible Belt' states.
These are hampered by debt and poor economic growth, exacerbated by worsening environmental conditions. Many people are now flocking to the north, which is more politically stable. Some of those already living in the north are moving into Canada, which has even better long-term prospects. The U.S. population, as a whole, continues to increase for the rest of the century, though at a slower rate than before. An interstellar radio message arrives at Gliese 777. The Yevpatoria RT-70, located at the Centre for Deep Space Communications in Ukraine, was among the largest radio telescopes in the world, with a 70 m antenna diameter. On 1st July 1999, it beamed a noise-resistant message named "Cosmic Call 1" into space. This was sent towards Gliese 777, a yellow subgiant star, 52 light-years away in the constellation of Cygnus. At least two extrasolar planets were known to be present in this system. In April 2051, the message arrives at its destination, ready for any potential alien civilisation to hear and decode. Britain holds its centennial national exhibition. A centennial national exhibition is held in the UK, in keeping with the precedent set by the Great Exhibition of 1851 and the 1951 Festival of Britain. The opening ceremony is attended by King William V, now aged 69. The Suburban Rail Loop is operational in Melbourne. The Suburban Rail Loop is Australia's largest ever transport project. It consists of a 90 km (56 mi) line running through suburban Melbourne and surrounding the central business district (CBD). The loop is designed to help commuters connect easily to major suburban hubs and amenities such as hospitals, shopping centres, universities, and the airport. The service connects almost all existing railway lines via one new route through middle suburbs, thus removing the necessity to travel into the city and out again as required by the older hub-and-spoke rail network.
The line connects the existing station at Cheltenham with other existing stations at Clayton, Glen Waverley, Box Hill, Heidelberg, Reservoir, Fawkner, Broadmeadows, Sunshine and Werribee. It also links to new stations, built in areas that were long promised rail connections, including Monash University, Burwood, Doncaster, Bundoora, and Melbourne Airport. The Suburban Rail Loop was first proposed in August 2018 at a cost of AU$50 billion and received strong public support. Like many government projects, however, it inevitably went over budget. Construction started by 2022, with sections progressively opening. By the early 2050s, the full line is operational and handling 400,000 passengers a day – relieving pressure on existing city-bound trains, while also taking 200,000 vehicles off congested roads.

2052: The biggest supermoon of the 21st century. A so-called "supermoon" occurs when a full moon coincides with a lunar perigee – the closest approach of the Moon in its elliptical orbit around Earth. Such events produce the largest apparent size of the lunar disk as seen from Earth, making it appear 15% larger and 30% brighter than during apogee (its most distant point from Earth). This provides great opportunities for astronomers and photographers. Particularly dramatic supermoons occur when a full moon and lunar perigee also happen when Earth is at perihelion (its closest point to the Sun for the year). On the morning of 14th November 2016, the distance between the centre of the Moon and Earth was 221,524 miles (356,509 km), the closest they had been together during a full moon since 1948. The Moon would not appear this large again until 25th November 2034. The closest supermoon of the century occurs on 6th December 2052. Notable supermoons are also observed in 2070, 2088 and 2098.
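The size and brightness differences quoted above follow directly from geometry: apparent diameter scales with the inverse of distance, and brightness with its inverse square. A minimal sketch, using the perigee figure from the text and a near-maximum apogee distance of 406,700 km as an assumed comparison point:

```python
# Apparent size scales with 1/distance; brightness with 1/distance^2.
PERIGEE_KM = 356_509   # 14 November 2016 supermoon (figure from the text)
APOGEE_KM = 406_700    # near-maximum lunar apogee (assumed for comparison)

def size_increase(perigee_km, apogee_km):
    """Fractional increase in apparent angular diameter at perigee."""
    return apogee_km / perigee_km - 1

def brightness_increase(perigee_km, apogee_km):
    """Fractional increase in brightness (inverse-square law)."""
    return (apogee_km / perigee_km) ** 2 - 1

print(f"{size_increase(PERIGEE_KM, APOGEE_KM):.0%} larger")
print(f"{brightness_increase(PERIGEE_KM, APOGEE_KM):.0%} brighter")
```

With these distances the result comes out at roughly 14% larger and 30% brighter, closely matching the figures quoted for supermoons.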

2053: Genetically engineered "designer babies" for the rich. The ability to manipulate DNA has come a long way since the discovery of its double-helix structure in 1953. A century on, wealthy parents now have the option of creating "perfect" babies in the laboratory. This is done by picking and choosing their best hereditary traits. Gender, height, skin, hair, and eye colour – along with hundreds of other characteristics – can be programmed into the embryo prior to birth. The embryo is then grown in an artificial uterus. The most advanced (and controversial) techniques involve manipulating the brain to improve the child's intelligence, behaviour, and personality. Many conservative and religious groups decry what they see as the commercialisation of the human body. Australia becomes the first country to eliminate cervical cancer. Cervical cancer is a cancer arising from the cervix, the lower part of the uterus in the human female reproductive system. The first known description of the illness was made by the Greek physician Hippocrates in around 400 BCE. However, it was not formally identified as a sexually transmitted disease until 1834 and surgical treatments were not available until 1898. By the early 1900s, it was the biggest cancer killer of women in some countries including the USA. In 1943, the Pap test was first generalised as a procedure, enabling doctors to detect cellular abnormalities that might be cancerous and begin treatment before they could spread. Over the following decades, the test was credited with driving down cervical cancer death rates in much of the developed world. A link between human papillomavirus (HPV) and cervical cancer was identified in the 1970s when the German virologist Harald zur Hausen demonstrated the presence of HPV DNA in cervical cancer cells and genital warts. However, it was not until the mid-1990s that the primary role of HPV in the development of cervical cancer was definitively confirmed.
Subsequently, HPV was identified in 99.7% of cervical cancer specimens worldwide. Zur Hausen's breakthrough in medical science was important enough to earn him the Nobel Prize for Physiology or Medicine in 2008. In the mid-1980s, an HPV vaccine was developed, in parallel, by Georgetown University Medical Centre, the University of Rochester, the University of Queensland in Australia, and the U.S. National Cancer Institute. In 2006, the U.S. Food and Drug Administration approved the first preventive HPV vaccine, under the trade name Gardasil. This was 99% effective in protecting against the most common strains of HPV that caused cancer in women. Other versions of the vaccine became available in subsequent years, protecting against various strains. While progress was being made in richer countries, cervical cancer remained a significant problem globally. During the early 21st century, it was both the fourth most common cancer and fourth most common cause of death from cancer in women. In 2012, an estimated 528,000 cases were reported and 266,000 deaths – about 8% of the total cases and total deaths from cancer. About 70% of cervical cancers occurred in developing countries and it was a leading cause of cancer death for women in low-income nations. Australia was among the countries at the forefront of the fight against cervical cancer. In 2007, its government began a school-based vaccine program that virtually eliminated new infections of HPV for those who were immunised as young teenagers. In 2008, the Australian Cervical Cancer Foundation (ACCF) was founded with a goal to promote women's health "by eliminating cervical cancer and enabling treatment for women with cervical cancer and related health issues in Australia and in developing countries." Ian Frazer, co-inventor of the Gardasil vaccine, was a scientific advisor to the ACCF. 
The country had also developed a world-class screening program that was 100% effective at preventing cervical cancer for women who fully committed to it. A new vaccine was introduced in 2018 to target more virus strains of HPV that caused cancer. In the 2010s, fewer than 1,000 cases of cervical cancer were being reported in Australia each year. This figure, alongside the mortality rate, continued to fall, reaching negligible numbers in subsequent decades. By the early 2050s, cervical cancer has been essentially eliminated in Australia – the first country to achieve this milestone – thanks to the combination of highly effective screening and vaccination programs. Following the example of Australia, universal testing and immunisation programs are now ubiquitous in the developed and most of the developing world, providing hope that the illness can be eliminated globally by the end of the century.

2054: Rainfall intensity has increased by 20%. As the world warms, the increased evaporation is putting greater amounts of water vapour into the atmosphere. Rainfall intensity rises by 7% for each degree of additional warming. With temperatures approaching 3°C (5.4°F) above the 20th century average, the most extreme rainfall events are now 20% more intense than before. Dramatic increases in surface runoff, peak river flows and flash flooding are being experienced around the world – exacerbating soil erosion and putting huge pressure on drainage and sewage systems. This additional rainfall is a particular problem in the tropics and poor regions with insufficient infrastructure or flood defences.
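Compounding the 7%-per-degree relationship gives the 20% figure directly. A minimal sketch, where the 2.7°C input is an assumption standing in for temperatures "approaching 3°C":

```python
def rainfall_intensity_increase(warming_c, rate_per_degree=0.07):
    """Fractional rise in extreme rainfall intensity, compounding
    ~7% per degree of warming (Clausius-Clapeyron-style scaling)."""
    return (1 + rate_per_degree) ** warming_c - 1

print(f"{rainfall_intensity_increase(2.7):.0%}")  # about 20% at ~2.7 degC
print(f"{rainfall_intensity_increase(3.0):.0%}")  # about 23% at 3.0 degC
```

A simple linear reading (7% × 3 = 21%) lands in much the same place, so the quoted ~20% is consistent either way.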

2055: Spaceflight has taken a leap forward. Environmental catastrophes, overpopulation, war, and other crises have made humanity painfully aware of the limitations on its home planet. Many now believe that exploring and settling space could be a way to alleviate some of Earth's immediate problems. As a result of this, spaceflight has advanced considerably since the beginning of the century. National governments are able to participate to a certain extent, but huge levels of debt and economic stagnation have left the bulk of the effort to private enterprises and wealthy individuals. The cost of launching material into space has declined considerably by now. Advances in materials technology, greatly improved fuel efficiency for rockets and the proliferation of single-stage-to-orbit spacecraft have all contributed to this fall in prices. Automated design evolution, facilitated through artificial intelligence networks (enabling rapid synthesis of optimal design requirements) has also played a role. This is allowing much greater frequency of flights, as well as heavier payloads. One result of this has been a rapid growth in space tourism, with journeys available to even middle-income citizens. For the super-rich, even excursions to the Moon's surface are now possible. Lunar bases, already established in previous decades, have been expanded. In addition to room for tourists, new scientific modules have been added with greenhouses, ice harvesting stations for water, and solar arrays built from lunar regolith. Corporate interests are now looking to exploit the Moon commercially. Though human presence is still confined to the poles, a number of prospecting missions are underway in preparation for mining operations. Other long-term plans include solar power stations capable of beaming energy directly to Earth. In the more distant future, these may expand to completely encircle the Moon. 
Asteroid mining has now evolved into a huge industry, with major firms competing in the business. Thanks to progress in rocket technology and robotics, countless rendezvous with near-Earth and main belt asteroids have been conducted. A wide range of metals and minerals – including gold, platinum, nickel, iron, zinc, antimony, copper, cobalt, and phosphorus – are being recovered. Some of these materials became so rare on Earth that demand made them exceedingly valuable, driving accelerated exploration. Swarms of automated probes are now involved in prospecting and mining on a constant basis. Most asteroids are processed in situ, as opposed to Earth orbit, due to fears of an accidental impact. For now, manoeuvring larger asteroids is seen as expensive and unnecessary in any case. Water-rich asteroids are particularly useful as the constituent hydrogen and oxygen can be turned into rocket fuel. As part of the commercialisation of space, numerous fuel depots are in place around the Earth-Moon system and Lagrange points. These are further reducing the cost of spaceflight, with most ships only required to carry enough fuel to get into orbit. Longer and more complex missions are possible with supplies available en route. Asteroid mining has proven to be one of the great confirmations of people's hopes for outer space. A single rock just a mile or so in diameter may yield more platinum-group metals than have ever been mined on Earth, and more fuel than has been used in every rocket launch in history. The resources now being added to the global economy are helping to meet demand in many areas. However, significant portions of raw materials are being diverted to off-planet projects including the construction of new space stations. As a result, various non-profit groups have sprung up, aiming to ensure that poorer nations can benefit from space, not just the countries and rich individuals that can afford to go.
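The value of those fuel depots follows from the Tsiolkovsky rocket equation: splitting a mission's delta-v budget across refuelling stops sharply reduces the propellant fraction each leg must carry. A rough sketch, where the exhaust velocity and delta-v figures are illustrative assumptions rather than values from the text:

```python
import math

def mass_ratio(delta_v_ms, exhaust_velocity_ms):
    """Tsiolkovsky rocket equation: initial/final mass for a given delta-v."""
    return math.exp(delta_v_ms / exhaust_velocity_ms)

def propellant_fraction(delta_v_ms, exhaust_velocity_ms):
    """Share of a vehicle's initial mass that must be propellant."""
    return 1 - 1 / mass_ratio(delta_v_ms, exhaust_velocity_ms)

VE = 4_500.0  # m/s, assumed exhaust velocity (hydrolox-class engine)

# One tank, Earth's surface to trans-lunar injection (~12.6 km/s, assumed)
no_depot = propellant_fraction(12_600, VE)

# With an orbiting depot: ~9.4 km/s to reach orbit, refuel, ~3.2 km/s onward
leg1 = propellant_fraction(9_400, VE)
leg2 = propellant_fraction(3_200, VE)

print(f"no depot:   {no_depot:.0%} of launch mass is propellant")
print(f"with depot: {leg1:.0%} to orbit, then {leg2:.0%} for the next leg")
```

Because the mass ratio is exponential in delta-v, shaving the final few km/s off a single tank's budget pays off disproportionately, which is why depot networks make "only enough fuel to get into orbit" a workable design rule.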
The lucrative nature of this business and its growing influence on Earth has led to the passing of major new regulations, antitrust and monopoly laws. Ongoing conflicts around the world have spurred military powers to new heights. Developed nations are now turning to space to gain the advantage in next generation warfare. The USA is prolific in this regard. In earlier decades, international treaties prevented the militarisation of space. However, some of the more powerful nations have moved projects forward in secret. In any case, the volatile and rapidly evolving political climate has led to new agreements being introduced. A whole new dimension to war is emerging, in parallel with the commercialisation of space. The USA, for example, has established a comprehensive network of spy satellites, each equipped with a wide array of sensors able to observe people and objects on the ground with astounding resolution and detail. AI controls this system, automatically tracking known persons of interest and monitoring for suspicious activity. If enemy actions on the ground cannot be rationalised by the AI, government and military personnel are notified of it. Now controlling the most advanced and intelligent surveillance system in the world, America has regained some of its lost influence on the world stage. Naturally, other countries object to what they see as a potential for abuse of power. Alongside the spy satellites are manned space stations, placed strategically in geosynchronous orbit. These act as command centres, able to view battles from above in real time while giving directions to forces on the ground. Notably, they allow the military to organise and deploy squadrons of autonomous aircraft and robots. They also ensure that there are repair crews constantly on call, in the event of spy satellites or other craft malfunctioning. In today's hi-tech, fast-moving wars, communication and information are of unparalleled importance. 
These stations act as intelligence centres of sorts, and as such become prized targets for enemy forces. Knowing this, no expense is spared when it comes to advanced shielding and warning systems. The first space-based weapons systems are also in place. Most incorporate traditional missile capabilities, but other, more experimental systems are being utilised. One such weapon is an orbital kinetic bombardment platform operated by the US. This consists of two satellites in parallel orbits. The first provides a target and communication function, taking instructions from the ground on potential enemies. The second satellite is armed with several 20-foot long, specially reinforced tungsten rods, each complete with tail fins and an internal guidance computer. Upon instruction, a rod is released over a ground target and begins to fall. Using nothing but gravity and pure force, these missiles can impact with the power of a tactical nuclear warhead – only without the deadly fallout. Almost no bunker is safe, as the weapon lands with utterly devastating force, penetrating deep underground. Several problems needed to be overcome before this system was operational, such as the cost of lifting the materials into space and creating rods of sufficient strength to withstand the energies involved. Years of experimentation have finally yielded a metal alloy tough enough to be used. This project – the ultimate in bunker busting technology – is highly secretive and only tested in the deserts of the American southwest. Orbital solar power, since its introduction nearly 15 years ago, has grown considerably. Various new stations are now in place, able to provide continuous power to Earth. In addition to commercial power production, orbital solar has been found to have military applications. Modern armies no longer require a fixed source of power, with energy beamed down to even the most remote locations. 
Naturally, these solar power systems, as well as much of the other activity in space, increase the danger of space junk. However, genuine solutions are now finally appearing that can eliminate such debris once and for all. One option is the use of ground-based lasers, which move larger pieces into decaying orbits to eventually burn in the atmosphere, while massive aerogel nets guided by satellites can sweep up the finer particles in orbit. AI and robotics have played a major role in opening up space. Automated construction of spaceships and supplies is another area of reduced expense, while AI is used by asteroid mining companies to operate their wide range of robotic explorers and miners. In general, the increasingly complex and chaotic network of spacecraft now in place has necessitated the use of strong AI to coordinate operations. Many jobs previously filled by humans have now been given over exclusively to robots and computer programs. Spaceships are almost never piloted by human hands, with everything from docking to refuelling to landing completely automated. Naturally, many passengers view this level of AI control with trepidation, as cultural fears still linger around space travel. Farther away from Earth, numerous robotic probes, each equipped with its own AI, are exploring the planets, moons, and asteroids of the outer Solar System. These are yielding unprecedented amounts of empirical data about the nature of these bodies and the early history of the Solar System. Space-based telescopes have seen phenomenal improvements over the decades. Exponential progress has led to telescopic power increasing by a factor of over 100,000. Thanks to this astounding rate of advancement, it is now possible to view extrasolar planets in close detail. The number of known planets beyond our Solar System – about 800 in 2012 – has grown to 13 million by 2055.
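The growth figures above describe steep but steady exponentials. A quick sketch of the implied rates, assuming (hypothetically) that the 100,000-fold gain in telescopic power accrued over the same 2012–2055 window as the exoplanet catalogue:

```python
import math

def cagr(start, end, years):
    """Compound annual growth rate implied by a start and end value."""
    return (end / start) ** (1 / years) - 1

def doubling_time(total_factor, years):
    """Years per doubling, given total growth over a period."""
    return years * math.log(2) / math.log(total_factor)

YEARS = 2055 - 2012  # 43 years (assumed window for both trends)

print(f"exoplanet catalogue: {cagr(800, 13_000_000, YEARS):.0%} per year")
print(f"telescopic power doubles every {doubling_time(100_000, YEARS):.1f} years")
```

Under those assumptions, the catalogue grows at roughly 25% per year and telescopic power doubles about every 2.6 years, a Moore's-Law-like cadence.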
Thousands of these bodies have been observed in the habitable "Goldilocks" zones of their respective star systems, including a number of Earth-like planets with liquid water oceans and active hydrological cycles. The possibility of finding alien life expands greatly during this time, as does the hope of achieving contact with intelligent civilisations. Despite the huge progress in this field, however, humans have still barely scratched the surface of the Milky Way. Recent progress has been achieved with antimatter propulsion – making human travel to the outer Solar System a real possibility in the coming years. A permanent base on Mars is in the late planning stages, set to be established by a consortium of national governments and corporate interests. Longer-term projects are now being considered, with international talks being held over the future construction of a space elevator, to be located on the equator. Corporations are also looking to the massive, untapped wealth of the gas giants and outer Solar System as an eventual goal. By all accounts, private interests are driving a new era of space exploration. Rapid progress in science and technology, combined with surging demand for resources, implies that humanity is well on its way to becoming a space-faring civilisation. The vast majority of countries have achieved democracy. The ongoing spread of information – aided by mobile telecommunications, social media, and other technology – continues to nurture democracy. The vast majority of countries now have free and fair elections. However, this general upward trend has begun to plateau in recent decades. Climate change is now having a significant impact on regional stability, particularly in Africa and the Middle East, where concerns over scarcity of resources have created conditions allowing dictators and authoritarian governments to make a comeback.
In any case, a number of cultures are simply more compatible with monarchies, theocracies, and autocracies at the present time. These parochial nations will remain undemocratic for some time to come. Global population is reaching a plateau. The global population is stabilising at between 9 and 10 billion. Most of the recent growth has occurred in the developing world. However, better education along with improved access to contraception, family planning and other birth control methods is now markedly reducing the number of children per couple. Information technology has played a major role in boosting literacy levels and spreading knowledge to the world's poor. The global population is also getting older, putting a huge strain on government welfare systems and employment. More than a fifth of the world's population is now aged over 60 – and with so many breakthroughs in medicine, this trend will only continue. More than two-thirds of people live in urban areas by this time, compared with 50% in 2010, and there are vast, sprawling megacities in all corners of the globe. In the very densest parts of the world, the tallest skyscrapers reach thousands of metres in height, are occupied by millions of people, and are effectively cities in their own right with self-sufficient energy and food production. Many residents within these towers spend almost their whole lives in these buildings, with little need or desire to venture outside. The population of the USA has reached nearly 450 million now (up from 309 million in 2010), with Hispanics doubling their share of the population to 30% and Asians going from 5% to 9%. Non-Hispanic whites have become a minority, with their share dropping to below 45%. They made up 85% of the population in 1960. Due to climate change, living standards have been highest in the northern states, which have better access to water and are generally more stable. California, Arizona, New Mexico, and Texas have seen huge declines in wealth and influence.
Despite recent advances in energy, food production and other technology, there are still widespread conflicts around the globe – due to a rapidly worsening environment, coupled with a host of socio-political issues as the world struggles to adopt a more sustainable economic paradigm. Huge shantytowns have formed in some countries, with millions of people displaced. The worst-affected regions are so destitute that they have been reclassified as "fourth world" countries. Desperate attempts are now underway to sequester carbon from the atmosphere in the hope of reversing the effects of global warming. Traditional media have fragmented and diversified. By the mid-2050s, traditional Western news corporations no longer exist. News gathering, analysis and distribution has fragmented – shifting to millions of creative individuals, bloggers, citizen journalists and small-scale enterprises. These work cooperatively and seamlessly, utilising a “global common” of instantly shared knowledge and freely available resources. This includes information retrieval not only from cyberspace, but also in the real world; embedded in everything from webcams and personal digital devices, to orbiting satellites, robots, vehicles, roads, streetlamps, buildings, stadia, and other public places. Even people themselves have become a part of this collection process. Bionic eye implants (for example) can relay data and footage on the spot, in real time, from those willing to participate. Traditional Western TV channels have largely disappeared, replaced by unique "personalised" web channels, covering practically any subject or combination of subjects imaginable. These are filtered and customised to the exact tastes and requirements of the individual and are viewable anywhere, at any time. They can be highly interactive and are often experienced in virtual reality settings, rather than on a screen. 
This is especially true of movies, many of which have non-linear plotlines allowing the viewer to influence the outcome themselves, or even to become characters within the film. Mass advertising, too, has undergone a revolution in Western societies. Some of the oldest outdoor media still exist – such as posters, billboards, and leaflets – which continue to survive in holographic and other forms. However, online web and televisual product/service information is now accessed almost entirely from on-demand, advanced customer feedback networks along with automated, semantic web assistants. Together these can provide instant, factual, and trustworthy information on a highly personalised level: automatically filtering any marketing bias or corporate propaganda which might have influenced a consumer in the past. Despite the increased choice and empowerment, one major consequence of this fragmentation (a trend which began in the 1980s) has been increased isolation of the individual. A decrease in the shared experience of media has led to a further decline in Western family life. Poorer nations are still reliant on traditional forms of media, marketing, and information exchange. In the near future, however, they too will make the transition – thanks to rapidly improving access to web technology.

2056: Global average temperatures have risen by 3°C. Global warming has begun to race out of control with temperatures fed by increasingly strong feedback mechanisms. Melting permafrost in the Arctic is now releasing vast amounts of methane – a greenhouse gas more than 70 times stronger than CO2. Plants are decaying faster in the warmer climate, while the oceans are liberating ever greater quantities of dissolved CO2. The Earth is now the hottest it has been since the mid-Pliocene, over 3 million years ago, and there are permanent El Niño conditions – resulting in widespread, extreme weather events in regions around the world. Severe droughts, torrential flooding, hurricanes, and other disturbances are now a constant feature on the news. Southeast Asia, the Middle East and Africa are the places most affected. Developing countries dependent on agriculture and fishing – especially those bordering the Pacific Ocean – are particularly badly hit. In Pakistan, a calamity of epic scale is unfolding. The nation has been declared a failed state, its government having lost control, with armed gangs seizing what little food and water remains. Tens of millions of refugees are attempting to leave the country as rivers run permanently dry. India has fared little better. The country’s agriculture is now under severe stress, with monsoons ranging from extremely wet seasons to extremely dry ones. In the more intense wet years, the flooding is catastrophic, submerging vast areas of land. In America, the east coast is being hit particularly hard now. Chesapeake Bay – the largest estuary in the country – has been devastated by recent flooding disasters, rising sea levels and storm damage. The economies of Maryland and Virginia have suffered greatly. Much of the Gulf Coast region has been abandoned, while droughts are worsening in the southwest of the country. More than 15 million Americans now qualify as displaced persons.
A surge in migration to Canada is underway – one of the few areas of the world that still offers somewhat favourable environmental conditions. In Europe, food riots have continued to spread. Temperatures that were previously found only in North Africa and the Middle East have become the norm in central and southern parts of the continent. Britain now has a Mediterranean climate and is engaged in a food-sharing process with its neighbour Ireland. Rising sea levels, erosion and storm surges are wreaking havoc on the coastline. Australia is being plagued by extreme heatwaves. The country is experiencing severe and prolonged droughts, together with a huge increase in wildfires and dust storms. The elderly are especially at risk from this hotter and drier weather. The Arctic seas – which became ice-free in September months by the 2020s – now have ice-free conditions throughout the year. This has made the region attractive to shipping and exploitation of natural resources, with various new trade routes being opened up. Iceland is benefiting from this, becoming like Singapore in some ways. Many previously uninhabited islands in the Arctic are now being colonised.

2057: Smart clothing is a trillion-dollar industry. "Smart" clothing and electronic textiles (or e-textiles) came to public attention in the opening years of the 21st century, with various items of apparel being demonstrated at the research and development stage, or as early consumer products. This form of wearable technology offered the potential to enhance a user's everyday experiences in ways that traditional fabrics could not. Initially targeted primarily at health and fitness enthusiasts, smart clothing gradually expanded into other areas – including medicine and patient monitoring, fashion, entertainment and gaming/virtual reality, workplace applications and military uses. It could generally be divided into two main categories: aesthetic and performance enhancing. Aesthetic examples included fabrics that lit up and fabrics able to change colour. Some of these fabrics gathered energy from the environment by harnessing vibrations, sound, or heat. Others worked by embedding the fabric with electronics to power it. Performance enhancing textiles, intended for use in athletic, extreme sports, and military applications, included fabrics designed to regulate body temperature, reduce wind resistance, and control muscle vibration. Additionally, companies began to develop smart fabrics to guard against extreme environmental hazards, such as radiation and the effects of space travel. Innovations also included drug-releasing textiles, as well as fabrics for the health and beauty industry with moisturiser, perfume, and anti-aging properties. The first generation of smart apparel had relatively modest capabilities – attaching a sensor to a garment or shoe, for example. The second generation had more advanced products with sensor(s) embedded within the garment. In subsequent generations, the garment itself became a sensor. A growing number of companies began to create flexible, stretchable, waterproof electronics with hydrophobic coatings able to repel water. 
Not only could these survive a laundry cycle or inclement weather, but they also had wireless power inputs, without the need for any batteries. Progress with sensor technology, new materials, artificial intelligence (AI), motion tracking and haptics led to increasingly sophisticated products that increased the functionality and consumer appeal of smart clothing. While it remained a small fraction of the overall clothing market, a compound annual growth rate (CAGR) of 20% enabled smart clothing to gain ever more market share. This allowed it to expand into a multi-billion-dollar industry in the 2020s and reach an inflection point during the 2040s. By the late 2050s, smart clothing has become a trillion-dollar industry and continues to see faster growth compared to traditional clothing. Almost everyone in the developed world now has at least one item of smart clothing. The technology is now so cheap that the developing world is catching up too, much like smartphones became ubiquitous globally in the 2010s. Some of the most common and popular items include garments able to continuously monitor heart rate, breathing, and other vital signs, alerting a hospital or medical professional to the first hint of trouble and ensuring a wearer gets the care they need. Longer-term conditions can also be monitored from data on the strength and geometry of movements, for example. Other clothes feature embedded photovoltaics, serving as wearable power supplies. This is especially useful for camping trips or other excursions. In a world of higher temperatures due to climate change, smart clothing can also provide cooling capabilities. Meanwhile, using smart gloves, scientists in the field can touch samples and obtain genetic sequences in a matter of seconds, while forensic teams can investigate crime scenes and improve the gathering of evidence. The world of fashion is also transformed, with garments able to quickly change appearance and texture, depending on a wearer's preference. 
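The trillion-dollar trajectory described above rests on compound growth. As a rough sanity check, sketched in Python, the 20% CAGR can be projected forward; the $3 billion starting value for 2025 is an illustrative assumption, not a figure from the text:

```python
# Compound growth check: can a 20% CAGR take smart clothing from a
# multi-billion-dollar industry in the 2020s to a trillion-dollar one
# by the late 2050s? The $3 billion 2025 baseline is assumed.

def project_market(start_value_usd, cagr, years):
    """Compound a starting market value forward at the given annual rate."""
    return start_value_usd * (1 + cagr) ** years

value_2058 = project_market(3e9, 0.20, 2058 - 2025)
print(f"Projected 2058 market: ${value_2058 / 1e12:.2f} trillion")
```

At 20% a year the market multiplies roughly 400-fold over 33 years, so even a low-single-digit-billions baseline clears the trillion-dollar mark before the late 2050s.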
Some of the more advanced clothes even incorporate video or holographic displays, providing opportunities for striking designs. In addition to gaudy catwalk creations, this also includes subtler and less showy versions, such as on-person buttons, wrist interfaces, holographic watches and so on. Smart clothing is also used in retail, catering, entertainment, and other venues – to generate name badges, for example, or change the appearance of a uniform. Seemingly endless applications are now possible, thanks to ongoing advances in technology. Smart clothing has grown from being a small and niche segment of the market in earlier decades, to a common part of many apparel and footwear ranges in the 2050s. By the 2070s, the majority of clothing will be smart, as the world becomes ever more saturated with biometric and other data. Handheld MRI scanners. The ability to scan, analyse and diagnose the body has taken a huge leap forward by now. Hi-res, 3D imaging of internal structures and brain activity is now possible using real-time video, rather than static photos. This can be accomplished with devices no bigger than a camera or tablet. In the late 20th and early 21st century, these machines were so bulky that they filled whole rooms. Scans typically required half an hour or longer to create. They were also expensive: upwards of a million dollars for a state-of-the-art model, with each individual scan costing hundreds of dollars. A new generation of machines began to evolve, based on supersensitive atomic magnetometers, detecting the tiniest magnetic fields. These replaced the enormous doughnut-shaped magnets used in the past. By the late 2050s, MRI scans have become as quick and easy as taking a photograph, with a hundredfold decrease in cost. This is allowing healthcare programs in developing countries to benefit particularly.

2058: A billion human brains can be simulated in real time. The late 20th and early 21st centuries witnessed orders of magnitude increases in computer power and data storage. Each new generation of chips was smaller and more energy efficient than the last, resulting in ever larger and more complex applications. This trend was known as Moore's Law and it led to the gradual emergence of artificial intelligence, combined with brain simulations down to the level of single neurons. Despite occasional setbacks, the exponential progress in computational power continued in subsequent decades, driven by further innovations in the miniaturisation of components, new system architectures, new materials, and new cooling methods. By 2058, a billion human brains can be modelled in real time on a single machine, at the level of individual neurons. In recent years, however, a physical bottom limit for transistor size has been reached, meaning that computers can only be made more powerful by becoming larger in size. This decade sees a profound change in the role of supercomputers – the very largest and most powerful computers – as they seem to take on a life of their own, expanding their infrastructure and software in ways that significantly influence local, regional and world affairs. This is raising major concerns regarding possible existential risks and unforeseen consequences. Until recently, global politics and economics were determined largely or entirely by human thought and emotion. However, it is becoming clear that new forms of machine super-intelligence and hybrid human-AI mergers are beginning to reshape the cultural zeitgeist. Computers are now so powerful that many high-level tasks in business and government are being delegated to them. 
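The scale of this milestone can be illustrated with order-of-magnitude arithmetic. The neuron count below is the commonly cited ~86 billion per brain; the per-neuron simulation cost is an assumed figure, not one from the text:

```python
# Back-of-the-envelope estimate of the compute needed to model a billion
# human brains in real time at the level of individual neurons. The cost
# of 1e4 operations per neuron per simulated second is an assumption.

NEURONS_PER_BRAIN = 8.6e10   # ~86 billion neurons per brain
FLOPS_PER_NEURON = 1e4       # assumed ops per neuron per second (illustrative)
BRAINS = 1e9                 # one billion brains

total_flops = NEURONS_PER_BRAIN * FLOPS_PER_NEURON * BRAINS
print(f"Required compute: ~{total_flops:.1e} FLOPS")

# For scale, compare against a ~1 exaFLOPS (1e18) machine of the early 2020s:
print(f"Roughly {total_flops / 1e18:.0e} times an early-2020s exascale system")
```

Under these assumptions the task demands nearly a million times the capacity of an early-2020s exascale supercomputer, which is why decades of continued exponential progress are a precondition for the scenario.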
Large-scale brain models can be used to gauge the likely response of a nation's entire population to new ideas, products, or hypothetical events, for example, or to test new biotechnology implants – often designed by the supercomputers themselves. While a truly accurate brain simulation (i.e., at the subatomic level) has yet to be perfected, the states of protein complexes can now be incorporated into the model of a single brain. Other applications of these supercomputers include measures to comprehensively deal with climate change, which finally starts to be reversed over the next several decades. At the consumer level, gaming devices now provide fantastically lifelike experiences. Full immersion VR is now a mainstream phenomenon, after seeing rapid development over the last two decades. Advances in procedural generation have led to Matrix-style worlds of breathtaking scale and ingenuity. Entire new societies have formed in cyberspace, with many people spending their whole leisure time engaged in them. When encountering a player or character online, it is practically impossible to distinguish between human and machine intelligence. The Beatles' music catalogue enters the public domain. Copyright law has remained largely unchanged since 2019. Accordingly, the Beatles' songs from 1962 enter the public domain, 96 years after the release of the band's first single. A radio telescope is built on the Moon. By 2058, a radio telescope measuring hundreds of metres wide is being developed on the Moon's far side. This provides a stable platform with slow rotation rate (0.5 arcsec/sec), beyond the interference of Earth's atmosphere and cluttered radio background. It can produce astronomical images with a clarity unmatched by any observatory on Earth or in space. Individual stars, billions of light years away, can be seen assembling into the first galaxies. The telescope is situated within an impact crater. 
Both it and the surrounding infrastructure are built using material gathered from the Moon itself – drastically reducing costs.

2059: The end of the oil age. For most of the 20th century, prospectors discovered far more oil than industrial societies could consume. This was an era of cheap and plentiful energy that saw huge growth in the world's economy and population. By 1970, however, a major slowdown in discoveries was observed. This continued into the 21st century and the industry now faced competition from renewables. By the late 2050s, the end of the 200-year oil age is being witnessed, as the final dregs of economically viable reserves are extracted. Plastics and other oil-based products have been replaced by alternatives, such as bioplastics. Mars has a permanent human presence by now. By the end of this decade, a permanent team of scientists is present on Mars. This comprises a highly international mix of people. The first civilian tourist has also arrived. Travel to Mars is now cheaper and faster thanks to new forms of propulsion, cutting journey times from six months to just a few weeks. The base will soon be expanded with additional facilities providing more energy, food production and recycling systems, along with mining equipment and other tools. Vehicles are being supplied too, improving the astronauts' mobility and enabling them to roam hundreds of miles. More sophisticated long-term bases are now being planned to accommodate larger teams of scientists, as well as corporate interests. The habitat modules are constructed partially underground, giving protection from the Sun's ultraviolet glare. Radiation-absorbing materials based on advanced nanotechnology are used in spacesuits, as well as on the exterior of the vehicles. These same materials have filters to block even the tiniest particles of dust, providing long-term protection against the environment outside. All of the above is providing the critical mass needed for self-sufficiency. Operations will soon be conducted entirely independently of Earth. In the coming years, the first children will be born on Mars.

2060: Flood barriers are erected in New York. Sea level rises and storm surges have begun to threaten even the business, financial and cultural heart of America. By 2060, what used to be a once-in-a-century type of flood is becoming a regular occurrence. This has led to the construction of sea walls, breakwaters, and locks to the south of Manhattan, including one very large lock at the harbour entrance. JFK Airport and other low-lying parts of the city are receiving protection too. This is one of the largest public works projects in US history and comes at huge cost. However, the costs of not acting would have been unimaginably greater. Many other cities around the world are enacting similar measures now. Tropical cyclones are wreaking havoc in the Mediterranean. Until now, the near-landlocked Mediterranean Sea was largely immune to the more violent forms of ocean weather. The worst storms that the sea experienced were the so-called "medicanes" – comparatively tame versions of the much larger and more destructive Atlantic hurricanes. The most notable example occurred in 1995, when a storm created a hurricane-like spiral for a short period of time, complete with an eye. By 2060, however, normal weather patterns around the world are evolving drastically as a result of climate change. With global temperatures over 3°C (5.4°F) above the 20th century average, the Mediterranean Sea is now home to a prolific hurricane basin. Warming seawater, combined with increasingly common low-pressure systems, is turning the region into an ideal incubator for tropical cyclones. These are now devastating coastal communities throughout the southern coast of Europe and the northern coast of Africa. These areas were already facing collapse due to heatwaves, chronic drought, and sea level rise. Most of Venice has been abandoned, after failed attempts to save it from sinking. Cities such as Athens, Barcelona, Tripoli, Tunis, and Alexandria will soon be following. Global extinction rates are peaking. 
Environmental destruction is reaching its apex now. Tropical forests are being especially hard hit, with 0.5% of animal and plant species going extinct each and every year – nearly ten times the rate seen in 2000. An aging population. In the early 21st century, around one in five of the European population was aged over 65. This meant that the pension costs, public health, and transportation needs (and sometimes the housing and social-welfare requirements) of each senior citizen were supported by taxes and other deductions from the incomes of four working-age people (aged 15 to 64). However, birth rates stayed low throughout the first half and into the second half of the century, whilst longevity was being extended through better medicine, gene therapy, nanotechnology, improved lifestyles and so on. This meant that the ratio of young to old began to shrink dramatically. By 2060 there are 50m fewer workers and 67m more seniors, so the ratio has changed to one in three. In other words, only two working-age people to support each senior. This has had a huge impact on government budgets, leading to a radical overhaul of social welfare. A similar pattern has emerged in other parts of the world. Japan has faced the biggest change of all, with 40% of its population now aged over 65, double the figure in 2006. The ozone layer has fully recovered. Chlorofluorocarbons (CFCs) were invented in the 1920s. They were used in air conditioning/cooling units, as aerosol spray propellants prior to the 1980s, and in the cleaning processes of electronic equipment. They also occurred as by-products of some chemical processes. No significant natural sources were ever identified for these compounds – their presence in the atmosphere was found to be almost entirely due to human activity. When such ozone-depleting chemicals reached the stratosphere, they were dissociated by ultraviolet light, releasing chlorine atoms. 
The chlorine atoms acted as a catalyst, each one breaking down tens of thousands of ozone molecules before being removed from the stratosphere. The ozone layer prevents most UV wavelengths of sunlight from passing through the Earth's atmosphere. In the late 20th century, huge decreases in ozone generated worldwide concern. It was suspected that a variety of biological impacts – such as increases in skin cancer, cataracts, damage to plants, and reduced plankton populations – resulted from the higher levels of UV exposure due to ozone depletion. This led to the adoption of the Montreal Protocol – one of the most successful international agreements of all time, which banned the production of CFCs, halons, and related ozone-depleting chemicals. Although this ban came into force in 1989, the molecules had a longevity of several decades. In 2006, the ozone hole was the largest ever recorded, at 10.6 million square miles. It was not until 2060 that it fully recovered. Technology has transformed modern education. Exponential progress in the fields of communication, information technology and computer science continues to reshape society. Some of the most important advances have occurred in education. Although many countries are being ravaged by global warming, access to learning is now so effortless and inexpensive that – paradoxically – even the poorest and most destitute of places can take advantage of it. Schools and classrooms as people from the 20th century would know them have largely disappeared by 2060. Networking has replaced in-person learning for the vast majority of students, who now take part in decentralised, online, and virtual classes. Strong AI has supplanted most of the roles that were formerly held by human teachers. These artificial instructors have instant access to vast repositories of data and knowledge, greatly expanding the horizons of learning environments. 
Students are exposed to a much wider variety of culture and ideas, since classes are no longer limited by geographical proximity. Connectivity allows young people with similar interests and abilities to learn together and be optimally matched in terms of personality types. Universal translators have removed any language barriers such international classrooms would have experienced in the past. Full immersion virtual reality allows modern "schools" to exist as purely online institutions, with a seemingly infinite variety of classes and subjects. These can be experienced through self-guiding neural implants – or more commonly, by simply wearing an external device like a headset or visor. Free software and the negligible cost of hardware have brought unprecedented levels of education to Third World countries. The lack of required physical infrastructure and reduced need to pay teachers has given even the poorest neighbourhoods access to a range of study far beyond anything seen in the past. While schools and colleges still exist in the physical world, these are declining in number and have been heavily influenced by information technology. Instead of paper or textbooks, students make use of portable tablet devices with essentially limitless power and bandwidth, again at negligible cost. As a consequence, global illiteracy has fallen below 1%. As well as technology, the process of education itself has evolved to meet the changing needs of society. With the continuing trend of mechanisation, the bulk of manufacturing and physical labour has been relegated to machines. Even many white-collar jobs have disappeared thanks to the emergence of strong AI. As a result, human work is increasingly confined to subjective, abstract and/or creative professions – such as science, art, design, law, etc. This has turned higher education into an absolute necessity in many countries. Methods for teaching students have changed in response. 
In the past, most systems of education consisted of a set period of time during which people moved through school. The grades they received played a large role in determining what opportunities they would have later in life. Regardless of how well students did in school, and regardless of whether they even understood the material, they would all complete their education at roughly the same time with whatever skill level they had managed to acquire. Now, this method has been reversed. In the most developed countries, semester-based learning has been replaced with a go-at-your-own-pace style of learning. The proliferation of virtual teachers has made it possible for a person's education to be exactly tailored to their own learning abilities and interests. This avoids the issue of exceptional students being held back, and struggling students being left behind. The overall result is that time spent in school has become variable, while the level of knowledge and skill one gets out of school has been given a floor. The physical (and virtual) classroom environment itself has also changed. It is common now for a single class to be taught by more than one teacher, allowing them to play off their individual strengths and give students a broader base of information. Teaching is much more reciprocal, with students learning from teachers, learning from other students, or even imparting their own knowledge back to the teacher. Also, the classic lecture environment has been replaced by a more hands-on approach. Much more of schooling involves practical application with teachers demonstrating exactly how their material can be used in the real world. Technology is bringing further innovations to education. A profound and world-altering paradigm – referred to in earlier decades as the 'Singularity' – appears to be on the cusp of emerging. 
For increasing numbers of people, direct merger of their brain with cloud-based, non-biological artificial intelligence has become necessary in order to keep up with the truly staggering amount of new information appearing each day. These upgrades are having a significant impact on the process of learning. Personal AI can guide a person's educational progression using detailed knowledge of their brain structure and learning abilities. By the end of the century, this method of assisted learning will evolve into a system of downloads, with new skills and facts seamlessly inserted directly into a person's brain. This will ultimately lead to the end of education in the traditional sense, with a new species of transhuman emerging based on automatic, instantaneous accumulation of knowledge and vastly amplified intelligence.
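The dependency-ratio figures in the aging-population entry above (four workers per senior falling to two, after 50 million fewer workers and 67 million more seniors) can be checked for internal consistency. The absolute population sizes below are back-solved from those ratios, not stated in the text:

```python
# Internal-consistency check on the European dependency-ratio figures:
# starting from four working-age people per senior, subtracting 50m workers
# and adding 67m seniors should give roughly two workers per senior.
# The absolute population sizes are back-solved, not given in the text.

def solve_initial_seniors(start_ratio, end_ratio, worker_loss_m, senior_gain_m):
    """Solve (start_ratio*s - loss) / (s + gain) = end_ratio for s (millions)."""
    return (end_ratio * senior_gain_m + worker_loss_m) / (start_ratio - end_ratio)

seniors = solve_initial_seniors(4, 2, 50, 67)
workers = 4 * seniors
end_ratio = (workers - 50) / (seniors + 67)

print(f"Implied baseline: {seniors:.0f}m seniors, {workers:.0f}m workers")
print(f"2060 ratio: {end_ratio:.1f} working-age people per senior")
```

Back-solving gives roughly 92 million seniors and 368 million workers as the implied baseline, a plausible order of magnitude for Europe's working-age population, so the stated figures are mutually consistent.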

2061: Halley's Comet returns. The most famous of the periodic comets, Halley's Comet last appeared in the inner solar system in 1986. Like most comets, it has a highly elliptical orbit – taking it close to the Sun for only a short time. Several unmanned probes are sent to explore it during this year, including the first robotic lander. The UK population reaches 80 million. The UK is now the most populous country in Western Europe, with more people than France (72 million), Germany (71 million) and Spain (52 million). The population of Europe, as a whole, has been declining since the 2030s. However, strong growth from immigration and a younger average population, combined with favourable environmental conditions, has allowed the UK to prosper and become the leading economic power in the region. The country's ethnic makeup has changed dramatically over the last 50 years, becoming far more diverse and geographically integrated. In particular, the numbers of black and Asian people in the most affluent areas have greatly increased. London has become a true mega-city. Its urban population (in the continuous built-up area surrounding the city) has swelled to almost 12m, while its total metropolitan area now encompasses the entire southern half of England. An extensive network of high-speed rail joins its various satellite cities. Other infrastructure being planned includes a series of tunnels spanning the Irish Sea.

2062: Nanofabricators are a mainstream consumer product. These all-purpose, desktop machines can reproduce a seemingly infinite variety of items. In effect, they are like miniature factories – more advanced versions of 3D printers seen in earlier decades. They have been around in certain military, corporate and medical environments for a while, but are now a mainstream consumer product. In appearance, they resemble a combined washing machine/microwave oven. Raw materials are purchased separately and can be loaded in solid, liquid or powder form. An interior compartment is accessed via a small hatch, where objects are constructed atom-by-atom. The process takes a matter of minutes, and the assembled items can be used immediately. New schematics can be accessed from the web and programmed into the machine.

2063: China is carbon neutral. In the early 21st century, China overtook the United States to become the world's largest source of carbon emissions, driven by rapid expansion in the use of coal and other fossil fuels. Many of its cities became notorious for their chronic pollution and extremely high levels of smog. While this rapid growth had contributed to China's emergence as a global power, its leaders recognised the need for change. The 2010s saw the adoption of policies more favourable towards clean energy and environmental protection, with dramatic improvements achieved in a relatively short time. In 2015, China became the world's largest producer of photovoltaic power, narrowly surpassing Germany. Its total installed capacity of solar photovoltaics grew from 800 megawatts (MW) in 2010 to almost 205 gigawatts (GW) in 2019, a more than 250-fold increase. Wind power, although nowhere near as fast as solar in terms of growth, nevertheless formed a major part of China's clean energy strategy. China became the largest producer of wind energy in 2010, with its total installed capacity reaching 31 GW, a figure that grew almost seven-fold to reach 210 GW by 2019. In 2020, the Chinese President, Xi Jinping, announced a long-term goal of carbon neutrality by 2060. This required China to peak its carbon emissions by 2030 with a rapid decline thereafter. China's next five-year plan (2021–2025) included measures to discourage coal use, with a transition to gas, accompanied by greater use of renewables, as well as nuclear. A nationwide carbon-emission trading scheme, the largest of its kind, also emerged. China's progress towards the 2030 target received a boost from the ongoing, inexorable decline in the cost of solar PV, as well as improvements in the efficiency and scalability of solar cells. Likewise, the cost of both onshore and offshore wind power continued to improve, with many huge farms being developed in the Yellow Sea, East China Sea, and South China Sea. 
This occurred in parallel with major advances in batteries for handling base loads. As flooding, droughts, storms, heatwaves, and other extreme weather disasters continued to worsen, the urgency of the climate crisis became ever more apparent during the 2030s. However, China succeeded in dramatically reducing its emissions, from a peak of around 10.6 gigatons (Gt) to just over 7 Gt by the end of the decade. The coal sector, now in a state of collapse, made way for the exponentially growing solar and wind power industries. New electric and hydrogen vehicles replaced traditional petrol-driven road vehicles, with sales of the latter being phased out by 2035. Despite this progress, considerable challenges lay ahead. Gas remained a significant part of China's energy supply, while industrial processes such as steel and cement making required the use of fossil fuels. By 2050, China had shrunk its emissions to 3 Gt. Many of the aforementioned industrial facilities underwent conversions to hydrogen or concentrated solar for high-temperature processes, such as steel rolling, along with greener forms of cement and other materials. The intensifying climate disasters now being witnessed around the world added further impetus to retrofitting or phasing out the older, aging plants and equipment. The vast majority of China's electricity now came from renewables and nuclear, while the aviation sector had seen a rapid transition to electric planes. New building codes ensured higher standards for energy conservation and heat management. Meanwhile, trees planted decades earlier during mass reforestation programs now reached maturity and provided a natural way of sequestering large volumes of CO2. As the decade drew to a close, the finishing line appeared in sight for carbon neutrality. By 2060, China's annual CO2 emissions had dwindled to just 0.2 Gt. 
With regulations now stricter than ever, in the face of a mounting global catastrophe, authorities in China sought to clamp down on businesses which continued to illegally use fossil fuels. Only a handful of such cases remained – typically in poorer, backwater areas of the country where small-scale remnants of legacy infrastructure from past decades could still be found. By 2063, the last of these have been identified and ordered to either adapt or shut down. In the four decades following 2020, China's efforts resulted in a cumulative reduction of 215 billion tonnes of CO2. This prevents 0.3°C (0.5°F) of global average temperature increase. The China of 2063 is now highly modernised, meeting World Health Organization standards for air quality, and well on its way to becoming the "ecological civilisation" envisioned by its government at the dawn of the century. Having transformed its energy system, China is now attempting to go even further – not only maintaining its carbon neutrality, but developing new ways of going carbon negative, with a portion of the carbon being turned into useful products. This goal is aided by the abundance of cheap, renewable energy now available, which has grown by orders of magnitude in recent decades. The megaprojects now being deployed include operations to sequester gigatons of CO2 each year, via direct air capture, to reverse the approximately 2.5 trillion tonnes of historical global emissions.
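As a rough check on the cumulative figure, the emissions waypoints given above can be integrated, assuming linear decline between them and an assumed 2020 starting level of 10 Gt (the 2020 value is not stated in the text):

```python
# Trapezoidal integration of China's emissions waypoints from the text:
# ~10.6 Gt peak around 2030, ~7 Gt by 2040, 3 Gt by 2050, 0.2 Gt by 2060.
# The 2020 starting level of 10 Gt is an assumption.

waypoints = [(2020, 10.0), (2030, 10.6), (2040, 7.0), (2050, 3.0), (2060, 0.2)]

def cumulative_emissions(points):
    """Trapezoidal integration of (year, Gt/yr) waypoints -> total Gt."""
    total = 0.0
    for (y0, e0), (y1, e1) in zip(points, points[1:]):
        total += (e0 + e1) / 2 * (y1 - y0)
    return total

actual = cumulative_emissions(waypoints)
print(f"Cumulative 2020-2060 emissions: ~{actual:.0f} Gt")

# The 215 Gt avoided implies a no-action counterfactual averaging roughly:
baseline_avg = (actual + 215) / 40
print(f"Implied no-action baseline: ~{baseline_avg:.1f} Gt/yr")
```

Under these assumptions China still emits roughly 257 Gt over the four decades, and the 215 Gt avoided implies a no-action baseline averaging around 11.8 Gt per year, a little above the historical peak, which is consistent with a counterfactual of continued growth.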

2064: Leadbeater's possum is going extinct in the wild. Leadbeater's possum – also known as the fairy possum – evolved around 20 million years ago. First discovered in southern Australia in 1867, it later became the faunal symbol of Victoria, the most densely populated state in the country. A shy, nocturnal, and fast-moving creature, it was rarely seen – occupying the highest parts of trees including the mountain ash (the world's tallest flowering plant), where it leaped athletically from branch to branch. Previously common in this region, it was feared to be heading for extinction after the draining of swamps and wetlands for agriculture in the early 1900s. Following the Black Friday bushfires in 1939 – which burned 20,000 km² (4,942,000 acres, 2,000,000 ha) of land – it was thought to have vanished. However, the animal was rediscovered in 1961, surviving in the Central Highlands about 80 km (50 mi) northeast of Melbourne. Leadbeater's possum began to recover, with numbers peaking at 7,500 by the early 1980s. They declined sharply from the late 1990s onwards, due to a population bottleneck and the combination of fire events and land use activities such as logging. The Black Saturday bushfires of 2009 – the country's worst ever natural disaster – were particularly damaging and destroyed 43% of the animals' habitat, reducing their wild population to just 1,500. Legislation passed in 2013 allowed a timber corporation access to these forests until 2040 and to be self-monitoring, a move that conservationists described as a death sentence for the remaining possums. The incoming Liberal/National coalition was similarly hostile to the environment. For millions of years, these primitive marsupials had been largely unchanged, still resembling their evolutionary ancestors in appearance and behaviour. Within a geological eye-blink, they were in danger of being wiped out completely. 
The so-called "old growth" trees – on which they strongly depended – covered about 4% of the Central Highlands area in 1964, but this proportion fell to 1% by 2011. As well as helping to regulate the rainfall patterns and water supply of Melbourne, this region also contained the world's most carbon-dense forests. Although protections were eventually increased, these efforts proved to be too little, too late. Alongside the ever-worsening droughts, heat and fires resulting from climate change, many decades of logging had caused fatal damage to the possum's ecosystem. These forests would not recover in time to save the species. By the middle of this decade, their population is heading for complete collapse.

2065: Longevity treatments able to halt aging. Various combinations of treatments are now available that can essentially halt the aging process, at a cost affordable to the average person. This is changing society and culture in profound ways. Rather than treating aging as a single process, researchers identified several distinct types of damage. As such, no "silver bullet" emerged for aging. It required a number of different approaches:
1. Junk – inside cells
2. Junk – outside cells
3. Cells – too few being produced
4. Cells – too many being produced
5. Mutations – chromosomes
6. Mutations – mitochondria
7. Protein crosslinks
In the early years of the 21st century, it was possible to extend the human lifespan by only two months per year. In other words, for every year a person lived, two months of additional life could be expected from advances in science and medicine. However, subsequent decades witnessed a revolution in medicine, with major advances in the use of stem cells, gene therapy, 3D printing of body parts, nanotechnology, and other techniques. Ever more sophisticated, powerful, and compact devices gained the ability to scan, identify and treat the most elusive of bodily defects at scales previously thought impossible. Exponential progress, aided by the prevalence of deep learning and other AI techniques, enabled the "longevity escape velocity" to edge closer and closer – first in mice, then later in monkeys, and finally in humans, with 12 months per year of additional lifespan being added. By the 2040s, this allowed some celebrities and other high-income individuals to remain in a relatively young and biologically healthy state. By the mid-2060s, cost reductions are combining with the expiration of patents, and further improvements in research, to enable the majority of the world's population to benefit. As with previous revolutions in science, debates have raged over the ethics and implications of an end to aging. 
However, there is generally strong support from the public, due to the improvements in quality of life (health span) and the potential to live a vastly extended life with all the experiences and opportunities that brings. Self-assembling buildings made 100% from nanotech. Nanotechnology – the control of matter on an atom-by-atom basis – has swept the world, transforming society in myriad ways. At the same time, new methods of automation are displacing the need for human labour on ever-increasing scales. A growing number of industries have seen their workforces shrink dramatically, with robots and AI handling the bulk of operations. Around the world, unemployment has soared. Construction companies are being particularly affected now. By the middle of this decade, it's becoming possible to build entire homes and offices using nanotechnology alone. For a typical square or rectangular plot of land, this takes the form of self-assembling machinery, based around a scaffold system that initially resembles a giant four-poster bed. Vertical columns, one in each corner of the site, support a platform that gradually rises from the ground, adding successive layers of material beneath it. The columns rise in tandem with the platform, whilst also relaying material, until the building is finally topped out. In effect, these machines are like substantially bigger versions of 3D printers and nanofabricators. For some of the more "unique" building designs or features, traditional methods of construction and engineering are still incorporated. Even these will eventually be replaced by self-assemblers as the technology advances further. Atom by atom, these intelligent machines lay the foundations, core, framework, flooring, electrics, doors, and other components – while robots inspect the interior, perform safety checks, and adjust where necessary. By the 2070s, even skyscrapers and other tall structures can be erected using this method. 
The process is so rapid, it takes a matter of days from groundwork to final completion. Humans are rarely if ever needed on site. Archival Discs are becoming unreadable. Archival Disc was a successor to the Blu-ray format, commercially introduced in 2015. In addition to a much larger storage capacity (initially 300 GB, later expanded to 1 TB), it also featured a longer lifespan. Various factors were known to affect the read/write quality of optical media – such as temperature, humidity, dust and other conditions, frequency of use and compatibility between disc and device. Archival Disc was designed to maintain readability for at least 50 years. By 2065, the first generation of these Archival Discs is becoming degraded. Any data on this storage medium that has not been backed up or transferred to an alternative format will now be lost. Insurance crisis. Damage wrought by accelerating climate change has led to most insurance firms filing for bankruptcy. In the United States, widespread flooding has resulted in hundreds of billions of dollars' worth of damage. Coastal cities are particularly badly hit. Much of the infrastructure in the southern states has been destroyed by category 5 hurricanes, with Houston and New Orleans lying virtually abandoned. Along the west coast, gigantic fires spread by the tinder-dry ground have ravaged much of the land. The economy of California is in tatters. Many of the biggest insurance firms have been nationalised by the government in a bid to avert economic collapse.

2067: The first generation of antimatter-powered spacecraft is emerging. A hundred years have passed since humans first ventured into space. For much of that time, manned craft were limited to the Earth-Moon system with only small, incremental advances in propulsion systems. After the legendary Apollo missions, it had seemed like anything was possible – even travel to the stars. But disappointment followed, as the Space Race ended, and priorities shifted elsewhere. The goal of colonising the Moon, putting men on Mars, and exploring the outer Solar System became a distant prospect: relegated to the realm of science fiction. As the early years of the 21st century unfolded there was a perception among many that this trend would continue. A number of setbacks reinforced this view – such as the retirement of the Space Shuttle, the cancellation of NASA's Constellation program and the relative lack of excitement around the International Space Station, along with an emerging financial crisis. In reality, however, great strides were being made in a number of areas. For a start, information technology was growing at an exponential rate; a pattern that had remained consistent for many decades and showed little sign of slowing down. Computer processing power, memory, data storage, bandwidth and a host of other measures were doubling in performance every 12-18 months, whilst declining dramatically in cost. This greatly accelerated the pace of research and development, as knowledge could be shared quickly and easily around the world. Billions of people gained access to the World Wide Web, fostering education and innovation on an unprecedented scale. Previously restricted to government agencies, space began to open up, becoming commercialised and industrialised. Entrepreneurial efforts by wealthy individuals led to a thriving market for space tourism, while crowdfunding and other creative options gave rise to many smaller-scale enterprises. 
The emergence of new players such as China and India further helped in reinvigorating space research. As the decades passed, a new generation of rockets was developed. Materials based on nanotechnology enabled stronger, lighter, and cheaper spacecraft. Artificial intelligence was another by-product of the information revolution, enabling systems to effectively design themselves. By the middle of the century, launch costs had been reduced by orders of magnitude. Alongside all of this, many important breakthroughs were made in the understanding of scientific processes and physical phenomena. Among the most significant of these was in antimatter production and confinement. In 2010, particles of antimatter were trapped for the first time at CERN in Geneva. Researchers produced, trapped, and then released a few dozen atoms of antihydrogen for around two-tenths of a second. The following year, this feat was achieved again but for 17 minutes – nearly four orders of magnitude longer than before. With stupendously high energy density (roughly 10 billion times more powerful than chemical reactions such as hydrogen and oxygen combustion), antimatter held potential as the ultimate source of spacecraft propulsion. Unfortunately, it was extraordinarily difficult and expensive to produce, with a few grams costing trillions of dollars and total production from 1950 to 2010 being just 10 nanograms. However, scientific and technological progress in the early-mid 21st century was occurring at an exponential rate. Anti-proton production began to increase substantially, aided by ever-more sophisticated models and simulations, together with AI programs that were beginning to match – and even exceed – human intelligence. This happened in parallel with rapid advances in engine design, materials science, and fusion power. By the late 2060s, the first prototype antimatter-powered spacecraft is demonstrated. 
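The "roughly 10 billion times" energy density figure can be sanity-checked with a short back-of-the-envelope calculation. This is an illustrative sketch, not from the original text: it compares E = mc² for matter-antimatter annihilation against an assumed chemical benchmark of 286 kJ released per mole of H2 burned in oxygen.

```python
# Rough sanity check of the "10 billion times more powerful" claim.
# Benchmark values below are my own assumptions, not from the text.

C = 2.998e8  # speed of light, m/s

# Matter-antimatter annihilation converts all reactant mass to energy,
# so the yield per kilogram of total reactant mass is simply c^2.
annihilation_j_per_kg = C**2  # ~9.0e16 J/kg

# Chemical benchmark: 2 H2 + O2 -> 2 H2O releases ~286 kJ per mole of H2,
# consuming 18 g of reactants (2 g H2 + 16 g O) per mole of water formed.
combustion_j_per_kg = 286e3 / 0.018  # ~1.6e7 J/kg of reactant mix

ratio = annihilation_j_per_kg / combustion_j_per_kg
print(f"annihilation is ~{ratio:.1e} times more energetic per kg")
```

The result lands in the billions, consistent with the order of magnitude quoted above.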
The "fuel" for this vessel consists of tiny pellets containing deuterium and tritium – heavy isotopes of hydrogen with one and two neutrons, respectively, in their nuclei (hydrogen normally has no neutrons). Inside each pellet, this fuel is surrounded by uranium. A beam of anti-protons, with an electrical charge of minus 1, is then fired at the pellets. When the anti-protons collide with the uranium nuclei, they annihilate, generating vast amounts of energy that trigger fusion reactions in the fuel. This provides thrust via magnetic confinement and a magnetic nozzle. Using this propulsion system, a trip to Jupiter can be achieved in just four months, using 1.16 grams of anti-protons. By the 2070s, a number of crewed missions are being conducted. Further advances in antimatter and ship designs pave the way for interstellar travel in the 22nd and 23rd centuries. Fully automated container vessels at 50,000 TEU. In the 1960s, shipping companies had begun to increase the use of container boxes. These allowed the bundling of cargo and goods into larger, unitised loads that could be easily handled, moved, and stacked, and that would pack tightly into a ship or yard. As the economy expanded and society became more globalised, container boxes grew more and more numerous, with a consequent demand for larger and larger sea-going vessels. The industry used a measurement known as TEU, which stood for "twenty-foot equivalent unit" – a reference to the rectangular container boxes with dimensions of 20' x 8' x 8'. A cargo ship with a capacity of 20,000 TEU, for example, could carry 20,000 shipping containers of that size. In the early 1970s, the biggest ships held approximately 2,000 TEU. In subsequent decades, carrying capacities increased substantially. By 2010 they had reached 15,000 TEU and by 2020 they regularly exceeded 23,000 TEU, an order of magnitude difference compared to 50 years previously. 
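The capacity figures above can be checked with a quick sketch. The TEU counts and box dimensions are taken from the text; the metric conversion and the volume calculation are my own additions for illustration.

```python
import math

# Largest ship capacities cited above, in TEU.
teu_1970, teu_2020 = 2_000, 23_000
growth = teu_2020 / teu_1970  # 11.5x, i.e. roughly one order of magnitude
print(f"{growth:.1f}x growth, ~{math.log10(growth):.1f} orders of magnitude")

# Volume of one TEU box (20 ft x 8 ft x 8 ft), converted to metres.
FT = 0.3048  # metres per foot
teu_volume_m3 = (20 * FT) * (8 * FT) * (8 * FT)  # ~36 m^3

# Implied cargo volume of a late-2060s 50,000 TEU vessel.
total_m3 = 50_000 * teu_volume_m3
print(f"one TEU ~ {teu_volume_m3:.1f} m^3; 50,000 TEU ~ {total_m3/1e6:.1f} million m^3")
```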
A notable incident occurred in 2021 when the Ever Given – one of the world's largest container vessels – ran aground in the Suez Canal, disrupting nearly $10bn of global trade every day for a week. Berthing fees divided ships into 100 metre brackets. For many years, this had kept their designs to just below 400 metres – resulting in wide, bulky, and vertically stacked vessels. However, the enormous and growing volumes of cargo requiring transportation meant that eventually, this ceiling would need to be exceeded for practical and safety reasons. As such, the first container vessels reaching half a kilometre in length began to emerge, their additional capacity offsetting the increase in berthing fees. In addition to length, ships continued to grow in width and draft level. Emerging markets in southeast Asia and elsewhere now had middle classes with improved per capita incomes and living standards, demanding more and more consumer items from around the world. By the late 2060s – a full century after their introduction – container vessels have reached truly enormous sizes. The largest are now exceeding 50,000 TEU or more than double their cargo volume in 2020. This has necessitated major upgrades of ports, such as the redesign of cranes and other terminal infrastructure. Gigantic, floating dry docks, with cranes on both sides that load or unload cargo from one ship simultaneously, are a common sight at coastal cities of the 2060s. Another major development is the widening of canals and other routes. Not only are these ships much larger, but they are also fully automated. Ports with automated facilities had already emerged in the 2020s and the ships themselves are now AI-controlled too. The days of piracy are long gone, since the vessels can operate entirely by themselves, preventing any possibility of hostage taking. Automated defences, such as armed drones, serve to further dissuade any would-be attackers. 
Any maintenance required during a voyage can be performed by robots as well. Another improvement is that the entire global shipping fleet is now 100% sustainable, as the older ships have all been decommissioned, while newer ships conform to international standards on emissions. Male and female salaries are reaching parity. In the developed world, the gender gap has narrowed to such an extent that salaries and rights are essentially equal for both sexes. Women are now playing a greater role in business and government than ever before. Just one consequence has been a significant reduction in military spending. The money and resources saved are being diverted to education, healthcare, transport, and environmental programmes, improving living standards and opportunities for many. With less male aggression in world affairs, more balanced and level-headed discourse is taking place on international issues. Widespread use of AI in neutral, objective, consultative roles is also reinforcing cooperation by providing more "logical" solutions to global issues.

2068: A major landmark in the world of athletics. Improved lifestyles and training techniques – including the use of VR for mental enhancement – saw many world athletics records continue to fall during the last few decades. At the 2068 Olympics, a major landmark is passed when a black male athlete completes the 100-metre sprint in less than nine seconds. However, these records will soon hit a barrier, as it becomes physically impossible for humans to run any faster without biotechnological aids. Indeed, a new breed of "super athlete" has emerged, as the authorities have legalised certain implants, drugs, and muscle-enhancing devices. This has resulted in a splitting of the games into three separate events – a "classic" group for natural, unenhanced athletes; the Paralympics for those with disabilities; and a third "cyber" category for those with biotechnology enhancements. The Paralympics will eventually disappear altogether as virtually all physical disabilities are overcome.

2069: Underground habitats are commonplace. By the mid-21st century, severe climate change gripped much of the world. As the global average temperature rise now exceeded 3°C and edged closer to 4°C, desperate measures were taken by governments, businesses, and citizens alike to adapt to a rapidly worsening environment. In addition to a gradual phasing out of fossil fuels, various carbon capture and storage methods were employed. This included natural solutions in addition to technological innovations. Massive tree-planting efforts, for example, allowed many cities and towns to achieve local cooling. Meanwhile, a shift away from traditional agriculture and in favour of cultured meat products, alongside plant-based alternatives, further reduced humanity's impact. However, despite the ubiquity of clean energy and other progress towards carbon neutrality, society remained in a fragile and perilous state. While the level of atmospheric carbon dioxide (CO2) and other greenhouse gases had now peaked and begun to fall, emissions from earlier decades were "locked in" to the system. Gargantuan amounts of heat lay trapped in the oceans, for example, which had soaked up 90% of global warming, while feedback loops had been triggered and some appeared irreversible for centuries. Rather like attempting to change the course of an oil tanker, the climate system was slow to respond to humanity's more eco-friendly direction. Significant portions of the Earth's surface were being rendered uninhabitable – due to temperatures passing the limits of human endurance, on top of extreme droughts, flooding, and hurricane activity. Having been a somewhat minor and distant threat in the first half of the 21st century, the reality of rising sea levels now emerged as one of the most urgent and serious issues facing the world. The looming spectre of nuclear conflict returned as some nations fought over territory and resources, such as India and Pakistan in the disputed Kashmir region. 
Amid the growing geopolitical chaos, mass migrations of refugees became the norm, particularly from Africa and the Middle East. Climate change served to exacerbate global inequality. A hyper-rich elite, now including a number of trillionaires, sought to isolate themselves and their interests by creating entire new micronations. While some chose to invest in "closed cities" with restricted access and military protection, others looked to the oceans to build enormous floating islands and undersea habitats. As the world entered the second half of the 21st century, an emerging ecosystem of space-based habitats provided yet another means of escape. By the late 2060s, a fourth option gaining widespread commercial and technical viability is underground towers, often referred to as "earth scrapers". These can reach hundreds of metres below the Earth's surface, utilising new materials and structural techniques in combination with self-sufficient power generation and production of food/water, fully automated recycling, heat management and so on. Some facilities are even powered by fusion, while others have access to geothermal energy if placed in the appropriate geographical location. Subterranean gardens, sports and entertainment, virtual reality and other features ensure that occupants are kept stimulated and free of claustrophobia. Unlike the overground micronations, most of these underground habitats provide the advantage of being essentially immune to nuclear war, as well as various other global threats. This physical robustness makes them increasingly attractive as long-term investments – but also for general security, privacy, and improved business operations in the short to medium term. While some communities opt for an isolated existence, away from the chaos and upheaval above ground, others are more open to visitors. 
As the number of residents in earth scrapers reaches into the tens of millions, they are joined by a network of hyperloop tunnels, linking countries and continents. Many of the design principles in these habitats are later employed on the Moon and Mars. Later still, they are adapted for colonies in hollowed-out asteroids.

2070: Global average temperatures have risen by 4°C. Vast stores of methane, released from melting permafrost, have triggered an abrupt change in the Earth's climate. The atmosphere has now shifted to conditions that last prevailed over 34 million years ago, before the onset of glacial-interglacial cycles. CO2 levels have reached almost 700 parts per million – two and a half times pre-industrial levels. This has resulted in a global average temperature increase of 4°C, with the Arctic region seeing rises as high as 15°C. In many parts of the world, the limits for human adaptation are being exceeded. Despite attempts to share food and resources between nations – and to accommodate the surge in refugee numbers – the sheer scale of this disaster is presenting enormous challenges, even with the technological base of the 2070s. The use of heavily modified GM crops, hydroponics, desalination, and other techniques has allowed some regions to maintain a degree of stability. Nanofabricators are also being utilised in the more advanced societies. For many others, however, it's becoming impossible to sustain any kind of agriculture at all, due to the water loss, soil depletion and other environmental impacts now being experienced. The intensity of freak weather events has increased dramatically, with hurricanes and severe storms, extreme flooding and droughts becoming widespread. A number of countries near the equator have been abandoned, their people scattered. City-scale flooding disasters are now commonplace as sea levels have risen a full metre, sweeping away trillions of dollars' worth of real estate. The number of displaced persons is overwhelming the ability of international organisations and governments to cope. Although many refugees are surviving and resettling in higher or lower latitudes, even greater numbers are unable to complete the journey, or are denied border entry, resulting in alarming numbers of deaths from hunger, conflict, and adverse environmental conditions. 
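The figures in this section are roughly self-consistent under the standard logarithmic approximation for CO2 forcing, ΔT ≈ S · log₂(C/C₀). This is an illustrative sketch with an assumed equilibrium climate sensitivity of 3 °C per doubling of CO2; the sensitivity value is my assumption, not from the original text.

```python
import math

S = 3.0     # assumed equilibrium climate sensitivity, deg C per CO2 doubling
c0 = 280.0  # pre-industrial CO2 concentration, ppm
c = 700.0   # scenario concentration quoted above, ppm

# 700 ppm is exactly 2.5x the pre-industrial level, as stated...
assert c / c0 == 2.5

# ...and the logarithmic approximation then lands close to the 4 deg C
# global average rise described in this section.
delta_t = S * math.log2(c / c0)
print(f"estimated warming: {delta_t:.1f} deg C")
```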
Traditional free market capitalism is facing enormous pressures and upheaval, as civilisation struggles to adapt to this new and rapidly changing world. Resource-based economies are evolving to take its place. For too long, humans exploited their environment with little appreciation of long-term consequences. Nature is finally beginning to redress the balance. Fusion power is widespread. Most leading countries now have at least one fusion plant either commercially operational, or in the process of construction. These reactors offer a clean, safe, and abundant supply of energy. Fully automated homes. Buildings in developed nations have become highly automated and self-sufficient. In addition to robots, a typical new home now includes the following:
- A localised power supply. Energy can be generated by the building itself, via a combination of photovoltaics and piezoelectric materials. Walls, roofs, and windows can absorb almost all wavelengths of light from the Sun with organic solar technology, turning it into heat and electricity. Friction generated by the occupant's footsteps – and various other kinetic processes – can also produce energy. This is converted and stored in any number of ways, from hydrogen to batteries. In countries where sunlight is less frequent, microturbines may be used in place of solar.
- On-site water production and waste management. Rain is captured by external guttering, then stored and converted into drinking water using nanofiltration systems. This is especially useful in regions prone to drought (which includes a substantial portion of the world by this time). If local water is in short supply, houses can serve as miniature reservoirs and filtration systems. Meanwhile, plastics and other kitchen waste can be placed in recycling machines, ground into extremely fine powder, then later re-used in nanofabricators.
- A multi-layered building envelope which provides a variety of dynamic effects. Windows can self-adjust their size and position – as well as their opacity – to optimise the level of natural light. In some of the more upmarket properties, the entire façade can morph its texture and appearance. Depending on the tastes of the occupant, this could transform into an art deco style, a classic Victorian building, or something entirely different. This form of "programmable matter" can even be designed by the occupants themselves and changed on demand.
- Air purification systems. Air within the home is kept fresh, purified and completely free of dust and microbes.
- Interactive surfaces. Holographic generators cover the whole interior of the property – including walls, doors, worktop surfaces, mirrors, and shower cubicles. These intelligent surfaces can track the position of the occupant and display information whenever and wherever necessary. A person can read emails, see news reports, and access the online world using virtually any surface in the house as a touch screen or mind control interface. Detailed, real-time information on their health, personal lifestyle and daily schedules can also be displayed. This system has a variety of other functions, e.g. it can be used to locate personal items which may have been misplaced.
- Intelligent/self-maintaining appliances. Appliances that don't repair or maintain themselves in some way have become largely obsolete by now. It is very rare for a human engineer to be called to the house.
- A modest size. The world is becoming an ever more crowded place, with available land continuing to shrink due to overpopulation and environmental decline. In city centres, apartments tend to be highly minimalist and compact, with small footprints utilising every inch of space. Full immersion virtual reality is one method of adapting to this. Another is flexible room layouts that reconfigure themselves on demand. In earlier decades, this was achieved in some homes by using a sliding wall system. Today, it can be done with morphable materials.
Five-year survival rates for liver cancer are approaching 100%. In the early 21st century, liver cancer was the third most common cause of cancer death in the world. Nearly 700,000 people died from the disease in 2008, accounting for 9% of all cancer deaths. Major risk factors included chronic infection with hepatitis B and C (accounting for 54% and 31% of cases, respectively), consumption of foods contaminated with aflatoxin, and heavy alcohol consumption. It was nearly three times more common in men than in women. In 2009, Japanese researchers began efforts to map the complete genome of liver cancer. This paved the way for blood tests to spot tumours earlier, whilst also yielding new drug targets. The increasing use of nanoparticle carriers – and eventually nanobots giving precise control and delivery of drugs – also greatly improved survival rates. Despite the global chaos unfolding at this point in history, scientific knowledge continues to advance incrementally. By 2070, five-year survival rates for liver cancer are reaching 100% in many countries.

2072: Advanced nanotech clothing. Decades have now passed since the first appearance of nanotech clothing. During that time, it has made extraordinary improvements in utility, power, and sophistication. Modern fabrics have built upon the abilities of previous generations, perfecting many of the technologies involved. Today, a complex blend of nanotechnology, biotechnology, claytronics, metamaterials and other components has yielded a type of clothing previously confined to the realm of science fiction. Though mostly restricted to specialised personnel, government forces and elites, a number of these suits are finding their way into the mainstream. Construction via self-assembling nanotechnology has been around for a number of decades. Until now, the process was only practical using bulky and/or conspicuous machinery, nanofabricators, or objects suspended in tanks of catalytic fluids. However, recent advances in nanorobotics have allowed for more subtle and rapid construction of macro-scale objects in a more compact form factor and with less impact on Earth's natural resources. As happened with early nanotech adoption in the 2020s, one of the easiest and most common applications has been in fabrics. Today, a high-end home "closet" may consist of simply a thin surface or pad built into the wall or floor, concealing a mass of nanobots and molecular building materials. A user can stand on or touch this surface and issue instructions to the machine (through voice command or virtual telepathy) for what to create. Each nanobot is then programmed with the final clothing design and set into motion. The process begins with the nanobots organising and categorising the building molecules, based on the aggregate material needed and where each piece will be located in the finished product. The nanobots – also called "foglets" – then begin interlocking with one another, forming a basic "skeleton" to which building molecules can be attached. 
As more and more nanobots and molecules are added on, thousands of individual fibres begin to form out of the machine's surface. These grow up and around the person's body, crossing each other to create a weave pattern, before finally taking the shape of traditional clothing. The result is a basic structure around which nanobots then construct the more advanced and customised features. Depending on the outfit's function, the original fibres can be interlaced with photovoltaics, piezoelectric nanowire, carbon nanotubes, metamaterials, claytronics, or any number of other useful materials. Tiny electronic devices can be added for communication or medical purposes. This whole process is completed in a matter of seconds. With such detail and control, fabric of this nature confers on the wearer an array of conveniences. In earlier decades, this technology was limited to relatively simple functions, like colour and texture modifications. Today, it is almost indistinguishable from magic. Complete wardrobes are no longer necessary, since one garment performs the function of many, transforming into an endless variety of styles and shapes. Most outfits are self-cleaning, self-fragrancing and rarely if ever need to be washed. They can instantly adjust themselves in emergencies – becoming harder than steel to stop a knife or bullet; cushion-like in the event of accidents or falls. If a person is injured, the fabric can administer life-saving drugs and medical nanobots, or contract to seal a wound. A drowning person can be made safe. Fire-fighters and other rescue workers are completely protected from hazards such as fire or radiation. This is also useful in space, protecting people from sudden changes in air pressure, micrometeorites, cosmic rays, and other hazards. Medical devices included in these outfits monitor for disease at all times, catching the earliest signs of cancer or infection and alerting the wearer before any damage is done. 
Whatever power is needed for the various functions is supplied by a combination of piezoelectric and photovoltaic components embedded throughout the clothing material. Some of these aforementioned comforts had already been available in earlier decades but were simpler and fewer – usually limited to just one, or a small number, within each item of clothing. Today, however, all of them can be fully integrated and combined into a single suit, created and maintained via swarms of intelligent foglets. As this technology evolves further, it becomes a permanent part of some people's physiology, almost like a second skin. Pico technology is becoming practical. Technology on the scale of trillionths of a metre (10⁻¹² m) is becoming practical now. Known as "pico technology", this is orders of magnitude smaller than the nanotechnology of earlier decades. Among other applications, it allows the structure and properties of individual atoms to be altered via manipulation of the energy states of their electrons. This can produce metastable states with highly unusual properties, creating new and exotic forms of matter.

2073: Plastic recycling rates are approaching 100% worldwide. By the early 2070s, plastic recycling is ubiquitous globally. Although some of the most rural and isolated areas still lack the required infrastructure and facilities, they now represent a negligible percentage. Belgian chemist Leo Baekeland invented the world's first synthetic plastic, Bakelite, in 1907. Improvements in chemical technology led to an explosion of new types of plastic, with mainstream adoption beginning in the 1940s and 50s. Production expanded at a phenomenal rate during the second half of the 20th century – from two million tons annually in 1950, to more than 200 million tons each year by 2000. With no end in sight to this upward trajectory and in light of concerns over its slow decomposition after disposal, as well as its toxicity, recycling emerged as a solution from the 1980s onwards. This was more environmentally friendly than incineration, or waste-to-energy methods, which had also begun some years before. However, even with both recycling and waste-to-energy systems in place, handling the mountains of plastic waste generated each year was proving to be a formidable challenge. Researchers noted the appearance of a "Great Pacific garbage patch", a gyre of marine debris in the north central Pacific Ocean estimated to contain over 1.8 trillion plastic fragments. By the early 21st century, this was having a substantial impact on seabirds, fish, and other life in the food chain with implications for human health too. In 2019, plastic even appeared at the bottom of the Mariana Trench, during a deep ocean expedition. Scientists calculated that plastic waste could exceed fish biomass by 2050, unless major international efforts reversed this trend. More ominously, plastic pollution was found to harm the growth, photosynthesis, and oxygen production of Prochlorococcus – the ocean's most abundant photosynthetic bacteria – responsible for 10% of oxygen breathed by humans. 
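The production figures quoted above imply a striking growth rate, which a short sketch makes concrete. This is a back-of-the-envelope calculation assuming smooth exponential growth between the two cited data points; the smoothness assumption is mine, not the text's.

```python
import math

# Annual plastic production cited above, in tonnes.
p_1950, p_2000 = 2e6, 200e6
years = 50

# Assuming smooth exponential growth over the period: a 100x increase
# in 50 years implies a continuous growth rate of ln(100)/50 per year.
rate = math.log(p_2000 / p_1950) / years
doubling_time = math.log(2) / rate
print(f"~{rate * 100:.1f}% per year, doubling roughly every {doubling_time:.1f} years")
```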
This bleak picture was reinforced by wildlife television documentaries, such as those narrated by the popular naturalist David Attenborough, broadcasting the uncomfortable reality to millions of viewers. The public was becoming increasingly alert to the scale and urgency of this crisis. Recycling proved to be surprisingly popular, with a growing number of countries willing to introduce supportive policies. In addition to routine collections of household waste by local governments, more ambitious measures were being adopted at regional and national levels. Gradually, the percentage of discarded plastic began to fall, while the recycled proportion increased. This trend continued through the 21st century. Japan was among the world leaders in plastic recycling. Its plastic waste utilisation rate increased from 39% in 1996, to 73% in 2006 and then 90% by the end of the 2010s. This figure approached 100% by the mid-2020s, making Japan one of the first nations to recycle essentially all of its plastic waste. Key to Japan's success was the passing of several recycling laws from the late 1990s onwards, mandating businesses and consumers to separate plastic waste, along with public awareness and education campaigns on the benefits of recycling. This occurred in response to a shortage of landfill space, with Japan being a densely populated nation. New technology also helped to convert an increasingly wide range of waste products into reusable items, such as PET bottles. European Union (EU) member states were making progress too, with more than 40% of their plastic packaging waste recycled in 2016, easily surpassing the EU's minimum target of 22.5%. Among EU member states, the Czech Republic ranked top with a recycling rate of 51% in 2014, followed closely by Germany, the Netherlands, Sweden, and Ireland.
The European Circular Economy Package (CEP) targets were set at 50% by 2025 and 55% by 2030 – alongside a goal of 100% for packaging, specifically – with a ban on various other single-use products made of plastic where alternatives existed. In the UK, charges were introduced for plastic carrier bags, leading to a nearly 90% drop in single-use bags. The manufacture and sale of cosmetics and personal care products with microbeads (solid plastic particles of less than a millimetre in their largest dimension) was banned, followed by plastic straws and cotton buds. Dozens of companies, including supermarket giant Asda, signed up to the UK Plastics Pact aiming to cut plastic pollution by 2025. Meanwhile, the government pledged to eradicate all "avoidable" plastic waste throughout the country by 2042. In the United States, no federal law existed to mandate recycling, with state and local governments introducing their own requirements. Only around 10% of plastic was recycled as of 2020, but a long-term plan by the American Chemistry Council proposed 100% of plastics packaging to be re-used, recycled, or recovered by 2040. To achieve this, plastic resin producers would focus on six key areas: designing new products for greater efficiency, recycling and reuse; developing new technologies and systems for collecting, sorting, recycling and recovering materials; making it easier for more consumers to participate in recycling and recovery programs; expanding the types of plastics collected and repurposed; aligning products with key end markets; and expanding awareness that used plastics are valuable resources awaiting their next use. With increasing pressure from both consumers and governments, hundreds of the world's leading packaging brands committed to ensure that 100% of their plastic packaging could be reused, recycled, or composted by 2025. 
This momentum was sustained into the 2030s and beyond, spreading to recently developed economies with higher incomes, now able to afford the necessary waste management infrastructure. Regions in Africa, the Middle East and Southeast Asia were rapidly catching up with the West and viewed environmental protection as a higher priority than before. In the 2010s, some had already begun to turn away foreign waste shipments, dumped on them from far away. An increase in public recycling bins, plastic bottle banks, reverse vending machines (with voucher or cash incentives), office workplace recycling schemes and other such measures all contributed to the ongoing, upward trend in global recycling. At the same time, new production methods were allowing bioplastics to be manufactured without the need for fossil fuels and to biodegrade easily. Thanks to advances in science and technology, this was becoming possible even for some types of plastic that had traditionally been regarded as extremely difficult to break down, convert and reuse. Automation, robotics, AI, and machine learning helped to improve the filtering and sorting capabilities at waste treatment plants. However, even low-tech solutions were able to combat the plastic problem: supermarkets in Thailand and Vietnam, for example, used banana leaves as a packaging alternative. By 2050, most countries had either substantially reduced or even eliminated plastic entering landfills. This left the problem of incineration, which had grown in parallel with recycling at a similar rate during the late 20th and early 21st century. New regulations and international treaties to restrict emissions from plastic incineration were combining with even greater improvements in waste collection and separation. This led to the percentage of recycled plastic overtaking that of incinerated material. 
Networks of orbiting satellites – equipped with powerful sensors and ultra-high resolution and zoom capabilities – kept a watchful eye on ground activities, to monitor emissions and ensure compliance. Gradually, plastic waste incineration disappeared from both developed and developing countries as the economic, social, and environmental benefits made more and more sense. Other advances have emerged in recent years. Among them are a new generation of domestic appliances for dealing with household waste, now able to deconstruct and recycle a seemingly infinite variety of plastic products. Inexpensive, compact, and easy to use, these are typically desktop machines in the same form-factor as a 3D printer, offering the dual functions of assembly and disassembly. As well as being popular in the home, they are found in many public venues and workplaces. Alongside these high-tech machines are regular drone and robotic patrols in towns and cities, which can identify and pick up discarded litter – often within a matter of minutes. The world of 2073 is a cleaner and tidier place. Challenges remain outside human-inhabited areas, however. The world's oceans, for example, remain in a dire state, being much more difficult to access and taking longer to restore. Those fragments of plastic that lie undiscovered will persist in ecosystems for another 500 years. The number of trillionaires in the world exceeds 10. The world's first trillionaire had emerged in the 2030s. The entire top 10 of the Forbes rich list is now composed of such individuals. Due in part to inflation, 20% of the global adult population now possesses a net worth of US$1 million or more. The rich have also become younger, more female, and less Western.

2075: The first space elevator is becoming operational. The idea of a space elevator had been around as early as 1895, when Russian scientist Konstantin Tsiolkovsky first explored the concept. Inspired by the newly built Eiffel Tower, he described a free-standing structure reaching from ground level into geostationary orbit. Rising some 36,000 km (22,000 mi) above the equator and following the direction of Earth's rotation, it would have an orbital period of exactly one day and thus be maintained in a fixed position. A number of more detailed proposals emerged in the mid-late 20th century, as the Space Race got underway and manned trips to Earth orbit became increasingly routine. It was hoped that a space elevator could drastically reduce the cost of getting into orbit – revolutionising access to near-Earth space, the Moon, Mars and beyond. However, the upfront investment and level of technology required meant that such a project was rendered impractical for now, confining it to the realm of science fiction. By the early decades of the 21st century, the concept was being taken more seriously, due to progress being made with carbon nanotubes. These cylindrical molecules offered ways of synthesising an ultra-strong material with sufficiently high tensile strength and sufficiently low density for the elevator cable. However, they could only be produced at extremely small scales. In 2004, the record length for a single-wall nanotube was just 4 cm. Although highly promising, further research would be needed to refine the manufacturing process. It was not until the 2040s that material for a practical, full-length cable became technically feasible, with the required tensile strength of 130 gigapascals (GPa). Even then, design challenges persisted – such as how to nullify dangerous vibrations in the cable, triggered by gravitational tugs from the Moon and Sun, along with pressure from gusts of solar wind. 
Major legal and financial hurdles also needed to be overcome – requiring international agreements on safety, security, and compensation in the event of an accident or terrorist incident. The insurance arrangements were of particular concern, given the potential for large-scale catastrophe if something went wrong. In the interim, smaller experimental structures were built, demonstrating the basic concept at lower altitudes. These would eventually pave the way to a larger and more advanced design. By the late 2070s, following 15 years of construction, a space elevator reaching from the Earth's surface into geostationary orbit has become fully operational. The construction process involves placing a spacecraft at a fixed position – 35,786 km (22,236 mi) above the equator – then gradually extending a tether down to "grow" the cable towards Earth. It also extends upwards from this point – to over 47,000 km (29,204 mi) – a height at which objects can escape the pull of gravity altogether. A large counterweight is placed at this outer end to keep it taut. Locations that are most suitable as ground stations include French Guiana, Central Africa, Sri Lanka, and Indonesia. As with most forms of transport and infrastructure in the late 21st century, the space elevator is controlled by artificial intelligence, which constantly monitors and maintains the structure throughout. If necessary, robots can be dispatched to fix problems in the cable or other components, from ground level to the cold vacuum of space. This is rarely required, however, due to the efficiency and safety mechanisms in the design. A major space boom is now underway, as people and cargo can be delivered to orbit at vastly reduced costs, compared with traditional launches. Over 1,000 tons of material can be lifted in a single day, greater than the weight of the International Space Station, which took over a decade to build at the start of the century. 
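The altitudes quoted above follow from basic orbital mechanics. A minimal sketch, assuming standard values for Earth's gravitational parameter, equatorial radius, and the sidereal day (none of which are stated in the text): the geostationary radius comes from Kepler's third law, and the release point for escape is where the cable's rotational speed equals the local escape velocity.

```python
import math

GM = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2 (assumed)
R_EARTH = 6.378e6     # equatorial radius, m (assumed)
T_SIDEREAL = 86164.1  # one sidereal day, s (assumed)

# Geostationary radius: orbital period equals one sidereal day,
# so from Kepler's third law, r^3 = GM * T^2 / (4 * pi^2).
r_geo = (GM * T_SIDEREAL**2 / (4 * math.pi**2)) ** (1 / 3)
alt_geo_km = (r_geo - R_EARTH) / 1000

# Escape release point: an object riding the cable moves at v = omega * r,
# which equals escape velocity sqrt(2 * GM / r) at r = 2^(1/3) * r_geo.
r_esc = 2 ** (1 / 3) * r_geo
alt_esc_km = (r_esc - R_EARTH) / 1000

print(f"geostationary altitude: {alt_geo_km:,.0f} km")
print(f"escape release altitude: {alt_esc_km:,.0f} km")
```

The first figure reproduces the 35,786 km quoted; the second lands at roughly 46,700 km, consistent with the rounded "over 47,000 km" span of cable beyond geostationary.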
Although relatively slow – taking many hours to ascend – the ride is much smoother than conventional rockets, with no high-G forces or explosives. Upon leaving the atmosphere and reaching Low Earth Orbit, between 160 km (99 mi) and 2,000 km (1,200 mi), cargo or passengers can be transferred to enter their own orbit around Earth. Alternatively, they can be jettisoned beyond geosynchronous orbit, in craft moving at sufficient speed to escape the planet's gravity, travelling onward to more remote destinations such as the Moon or Mars. In the decades ahead, additional space elevators become operational above Earth, the Moon, Mars and elsewhere in the Solar System, with a considerable reduction in costs and technical risks. Construction is also made easier by lower gravity: 0.16 g for the Moon and 0.38 g on Mars. Further into the future, space elevators are rendered obsolete by teleportation and similar technologies. The distribution of birds in the United States has been altered substantially. By 2075, a combination of global warming, land use and land cover changes has led to substantial environmental impacts in the United States. This has caused disruption to many animal species in terms of how they live, feed and breed. The habitat ranges of birds have been particularly affected. Not all of these changes have been negative, as some species have actually benefited and seen major increases in their traditional ranges. However, despite the extra room for expansion, these winners now face renewed competition and territorial contests from other species, as well as new predators. Generally speaking, higher temperatures have pushed the overall range of species further north. Desert-dwelling birds now have increased habitat, while those preferring cold weather conditions have seen a decrease. 
In contrast to the broad, sweeping influence of climate change, the effects of landscape changes resulting largely from human activity have been more scattered and focussed, with a very high loss of habitat at certain local and regional scales. Urban growth, deforestation, mining, and other resource use have been at least as impactful on ranges as climate change. Many familiar sounds and spectacles of birds that were previously common in backyards and countryside walks have disappeared, while new and strangely unfamiliar ones have appeared to replace them. Among the bird ranges seeing the greatest expansions are those of Gambel's quail (+62%), the cactus wren (+54%), scissor-tailed flycatcher (+46%), gray vireo (+45%) and painted bunting (+39%). Conversely, many state birds and mascots are vanishing, including the Baltimore oriole of Maryland, the brown pelican of Louisiana, the common loon of Minnesota and others. The iconic bald eagle has lost nearly 75% of its range. The Thames Barrier is upgraded. London is the latest of many cities to upgrade its flood defences in the wake of devastating storm surges and sea level rises. The original barrier was raised a total of 62 times between 1983 and 2001, and with increasing frequency in the decades that followed. Towards the end of this century, its successor will need to be raised over 200 times every year.

2076: Accurate simulations of viruses. Thanks to advances in lattice quantum chromodynamics, the vast majority of viruses have now been accurately modelled and simulated down to the quantum level. This provides what is essentially a complete picture of these tiny biological entities, which range from about 20 to 300 nanometres in size, with millions of different types. As this area of science continues to advance, ever larger and more detailed simulations become possible, providing new insights into the nature of matter. Unmanned probes to Sedna. Sedna is a trans-Neptunian "dwarf planet", similar in size and composition to Pluto. Discovered in 2003, it became the most distant object yet observed in the Solar System, and the largest body found orbiting the Sun in over 70 years. Its orbit is highly elliptical, ranging from 76 AU to about 975 AU over the course of 12,000 years. In 2076, it reaches perihelion (its closest point to the Sun) and unmanned probes are sent to explore it.

2078: The world's electricity has been fully decarbonised. Electrical phenomena had been studied since antiquity, but progress in theoretical understanding remained slow until the 17th and 18th centuries. Even then, practical applications for electricity were few. It would not be until the late 19th century that engineers were finally able to put it to industrial and residential use. The rapid expansion in electrical technology at this time transformed society and was a driving force behind the Second Industrial Revolution. The extreme versatility of electricity provided an almost limitless set of applications including transport, heating, lighting, communications, and computation. It became the backbone of modern industrial society. In the late 1870s, cities began installing large-scale electric street lighting systems based on arc lamps. After the development of a practical incandescent lamp for indoor lighting, Thomas Edison switched on the world's first public electric supply utility in 1882. Subsequent discoveries and inventions would include the first demonstration of electromagnetic waves (1888), the fuse (1890), wireless telegraphy systems (1894), a radio receiver to detect lightning strikes (1894), X-rays (1895), the first successful intercontinental telegram (1896), the first cathode ray oscilloscope (1897), fluorescent lamps (1901), diodes (1904), triodes (1906) and superconductivity (1911). The 1920s saw rapid growth in demand for domestic applications of electricity. Public exhibitions featured "homes of the future", and in 1926, the Scottish inventor John Logie Baird gave the first demonstration of a television system in London, launching a revolution in communication and entertainment. By the 1930s, around 50% of people in the United Kingdom and 70% in the United States had access to electricity, and these figures reached almost 100% within the next 30 years. The Second World War saw tremendous advances in electronics, especially in radar and other aircraft systems.
Following the war, Bell Telephone Laboratories developed the first working transistor, in 1947. With rising household incomes, consumer devices became more and more widespread, including vacuum cleaners, refrigerators, washing machines, dishwashers, air conditioning, colour TV, hi-fi and other audio systems, and, latterly, computers and microprocessors. Until relatively recently in world history, humans had lived simpler and slower-paced lives, in far smaller numbers. The Industrial Revolution marked the beginning of profound change, with a population explosion and previously unimaginable technology. Powering the machines, vehicles, lighting, and everything else transforming society required a vast amount of electricity. In the 19th and 20th centuries, the easiest and cheapest way to obtain this power was from the ancient stores of carbon-based materials that lay buried in the ground: fossil fuels. While many different subtypes existed, fossil fuels could be generally divided into three main variants – coal, oil, and gas. The first of these had been used for millennia in, for example, smelting and domestic heating, but became far more important during the Industrial Revolution with larger-scale applications. The first coal-fired power stations producing electricity for public use became operational in the 1880s. With enormous demand for electricity, and centuries' worth of mining reserves, a rush to acquire this valuable resource got underway. Likewise, oil became a major new commodity. By the 1920s, oil fields had been established in many countries including Canada, Peru, Poland, Russia, Sweden, Ukraine, the United States, and Venezuela. Natural gas, meanwhile, became more important during the second half of the 20th century. Hydropower had been around since ancient times – as a way of grinding grain, for example. It spread to Europe in the Middle Ages and provided mechanical power for textiles and machines during the Industrial Revolution.
It became a source for generating electricity in 1882 and expanded rapidly in the 20th century, with gigantic projects such as the Hoover Dam completed in 1936. Hydroelectric power reached 1,000 terawatt hours by the 1960s, or about one-quarter of annual global electricity demand. Nuclear electricity production was first demonstrated in 1948 and grew rapidly from the 1960s onward. However, accidents such as Three Mile Island (1979) and Chernobyl (1986) led to a subsequent stagnation. The high capital costs of building plants, together with perceived safety risks, raised further concerns about the industry's future. With ongoing advances in science and technology, new forms of electricity production emerged. The American satellite Vanguard 1, launched in 1958, became the first man-made object in space to incorporate solar electric power, with a 0.1W panel. The energy crisis of the 1970s led to a groundswell of public interest in solar research, development, and deployment. The first wind turbine for electricity production was built by Scottish engineer James Blyth in 1887. While this renewable energy technology saw widespread adoption in some countries, such as Denmark, it remained a minor source of electricity on a global basis. However, the installed capacity began to grow exponentially in the closing years of the 20th century. Offshore wind production joined the mix in 1991, which had the advantage of capturing stronger and more persistent winds at sea. At the start of the 21st century, global electricity sources consisted of coal (39.2%), natural gas (18%), oil (7.4%), nuclear (17%), hydroelectric (17%) and renewables (1.4%). The world's population, which stood at roughly a billion in 1800, exceeded six billion. Concerns over the environmental impacts of fossil fuels – already substantial in the 20th century – now began shifting to a whole new level. Air, land, and water pollution had been issues for as long as fossil fuels had been used.
Carbon now constituted a much more serious and longer-term threat in the form of global warming. It would become perhaps the greatest challenge for humanity in the modern era. The basic physics of heat-trapping greenhouse gases were established by the work of Fourier (1824), Tyndall (1859), Arrhenius (1896) and others who showed that without carbon dioxide (CO2), the Earth would be 30°C colder. By the early 21st century, carbon isotope ratios proved that human activity had increased the atmospheric concentration of CO2 by 45% – an unprecedented change when viewed on geological timescales, matched only by catastrophic events such as large asteroid strikes. The annual output of greenhouse gases from human activity now surpassed all of the world's volcanoes by two orders of magnitude. As a result, the global average temperature had warmed by 1.0°C (1.8°F) since the Industrial Revolution and was projected to rise by as much as 6°C by the year 2100. The Intergovernmental Panel on Climate Change (IPCC), established in 1988, had sought to improve the scientific understanding of climate change and facilitate the discussion of possible response options. As well as publishing reports, authored by thousands of scientists contributing on a voluntary basis, the organisation held conferences around the world every year. However, while the IPCC member states agreed that climate change was real and the current period of rapid warming was human caused, finding solutions proved extremely challenging. The level of action needed to stabilise greenhouse gas concentrations and reverse the temperature trend was simply too great, from an economic and political standpoint. To maintain a 50% probability of keeping the global temperature rise below 2°C, at least two-thirds of known fossil fuel reserves would need to remain buried in the ground. Even as the world continued to warm, with impacts becoming greater and more obvious, little or no progress could be achieved. 
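The figures in this passage – a 45% rise in atmospheric CO2 and roughly 1.0°C of warming – are consistent with the standard logarithmic forcing approximation. A sketch, assuming the widely used Myhre et al. (1998) coefficient and an illustrative transient response of ~0.5°C per W/m², neither of which is stated in the text:

```python
import math

# First-order approximation for CO2 radiative forcing (Myhre et al. 1998):
# delta_F = 5.35 * ln(C / C0) W/m^2. Both constants are assumptions
# for illustration, not values given in the text.
C0 = 280.0        # assumed pre-industrial CO2 concentration, ppm
C = C0 * 1.45     # the 45% increase cited (~406 ppm)

delta_F = 5.35 * math.log(C / C0)

# With an assumed transient response of ~0.5 C per W/m^2,
# the implied warming roughly matches the ~1.0 C quoted.
warming = 0.5 * delta_F
print(f"forcing: {delta_F:.2f} W/m^2, implied warming: {warming:.2f} C")
```

The 45% increase yields a forcing of about 2 W/m², which under these assumptions corresponds to roughly the 1.0°C of warming the text describes.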
Then in 2015, a significant milestone occurred at the COP21 summit. Known as the Paris Agreement, this recognised the need to keep the global average temperature rise below 2°C (generally agreed as the "point of no return") and to pursue efforts to limit the increase to 1.5°C. For the first time, it committed all countries to reduce their carbon emissions. Despite this new consensus, actually implementing the required policies was another matter entirely. Fossil fuels were the richest and most profitable enterprise in human history, forming the bedrock of the global economy and employing millions of people. With some $500 billion in annual subsidies, industry lobbyists held considerable influence, and governments were reluctant to give up trillions of dollars' worth of resources. Russia, for example, had the largest natural gas reserves in the world; Middle Eastern countries such as Saudi Arabia and Iraq were abundant in oil; and the largest coal reserves were in the United States, Russia, China, India, and Australia. Fossil fuels provided cheap and easy access to energy, helping to lift billions out of poverty. On the other hand, shifting away from fossil fuels and into clean energy offered many opportunities, regardless of climate change. The global economic costs of air pollution amounted to $5 trillion per year as a result of productivity losses and degraded quality of life, with up to 7 million deaths annually. Solar and wind power – alongside their health benefits – could provide further advantages, such as new jobs, technology innovation, energy independence, local and decentralised energy protected from blackouts, less price volatility, and so on. Besides solar and wind, other alternatives were available too, in some cases provided by geography. Iceland, for example, produced almost 100% of its electricity from geothermal and hydropower.
As the political debates raged on, more and more businesses could sense the winds of change and the beginning of an energy revolution. Investment now poured into the nascent renewable sector, as the cost of solar and wind power fell dramatically. From tens of dollars per watt in the 1970s, solar had fallen to $5 per watt by the year 2000 and continued to plunge, reaching less than a dollar per watt by the early 2010s. At the same time, solar cell efficiencies were improving with new production techniques, nanotechnology materials, and other breakthroughs. Wind turbines were growing larger, some reaching skyscraper sizes, with far greater capacities than before. Meanwhile, related technologies continued to advance, such as batteries and storage, home energy efficiency, smart grids, and electric vehicles. The writing was on the wall for fossil fuels. However, the industry would not disappear without a fight, and continued to expand. New reserves were being exploited via fracking, deepwater drilling, and other unconventional techniques. In a rather ironic development, the Arctic became a target for drilling, even as ice cover in this region hit record lows. By the late 2010s, the atmospheric concentration of CO2 had surpassed 400 ppm – a level not seen for millions of years – and the cultural zeitgeist shifted more and more in favour of action on climate change. Support for renewables was especially strong among the younger generations, now holding large-scale protests on a regular basis in many cities around the world. The percentage of electricity from coal had peaked worldwide at 42% and now entered a terminal decline, as the adoption of renewables continued to accelerate, surpassing 10% globally. In 2017, the cost of solar-generated electricity became cheaper than coal in India. A number of countries now pledged to eliminate coal power plants within a decade.
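The cost declines described above are often summarised as an experience curve: each doubling of cumulative production cuts the unit cost by a fixed fraction. A sketch assuming a ~20% learning rate for PV modules ("Swanson's law") – a common rule of thumb, not a figure from the text:

```python
import math

# Experience-curve sketch: cost falls by a fixed fraction (the "learning
# rate") with each doubling of cumulative output. The 20% rate for PV is
# an assumed, widely cited rule of thumb, not stated in the text.
LEARNING_RATE = 0.20

def doublings_needed(cost_start, cost_end, lr=LEARNING_RATE):
    """Doublings of cumulative production to move between two unit costs."""
    return math.log(cost_end / cost_start) / math.log(1 - lr)

# From $5/W (2000) to under $1/W (early 2010s), as quoted in the text:
n = doublings_needed(5.0, 1.0)
print(f"{n:.1f} doublings of cumulative PV output")
```

Roughly seven doublings of cumulative output suffice for a fivefold cost reduction at that learning rate, which is broadly in line with solar's actual growth over that decade.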
Recognising the existential threat to their industry, coal companies had begun to research carbon sequestration methods and extolled the promise of "clean coal", but the decline continued. The 2020s became a pivotal decade for renewables, as they entered perhaps their most disruptive phase. Solar was to play a major role in the energy transition. Many cities and larger regions now actually mandated that rooftop solar be fitted on all newly constructed buildings. In addition to these home, community and workplace installations, utility-scale projects attained gigantic proportions, dwarfing previous power plants. Likewise, many enormous wind projects were now coming online, such as the offshore East Anglia Zone with 7.2 gigawatts (GW) of capacity. Floating offshore wind – contrasted with fixed-bottom offshore wind farms – remained a niche sector throughout the 2020s. However, improvements in technology and cost competitiveness led to a major expansion in the 2030s. This allowed offshore wind to be deployed in vastly greater areas, as opposed to shallow-water locations near coastlines. Ongoing, exponential growth in renewables continued to eat away at the market share of fossil fuels. Coal in particular, now undergoing a full-scale collapse, contributed to a financial crisis on par with the Great Recession of the late 2000s and early 2010s. Future historians would argue that this "carbon bubble", exposing trillions of dollars in stranded assets, marked the beginning of the end for oil, gas, and coal-based electricity. Some of the richest countries were now fully decarbonising their power supplies. Alongside this, smart cities were improving energy conservation, while innovations such as wireless power transfer became increasingly common. Many cafes, restaurants, airports, and other public areas now featured "wi-tricity" for recharging electronic devices. 
By the 2040s, as the global average temperature rise approached the critical 2°C threshold, the battle against climate change appeared to be lost. However, carbon sequestration – including mass reforestation programs, direct air capture, and other methods – still offered hope. Carbon taxes were now implemented by most nations, while annual subsidies to fossil fuels had dwindled substantially, making future investments ever more unattractive. The first solar power satellites, beaming energy down from space, provided a way of capturing sunlight 24 hours a day. This power could then be distributed to isolated locations, or disaster-hit areas with damaged infrastructure. Fusion power – another major development in the 2040s – appeared to be nearing commercial availability, with a successor to the ITER experimental reactor showing great promise. By 2050, the percentage of electricity from renewables (excluding hydroelectric) had reached 50% globally. Including hydroelectric and nuclear, the zero-carbon figure now stood at nearly 70%. Coal was being phased out almost everywhere, except in a small number of the poorest countries. Oil, too, was on the verge of disappearing from use in electrical systems. Only natural gas remained a significant source of carbon-based electricity. Electric vehicles and plug-in hybrids, already ubiquitous in the West, now formed the majority of cars in Africa, the Middle East, Asia and elsewhere. Large ships and even airliners were now going fully electric. Further developments included the emergence of continent-wide "super grids" able to link together vast regions and optimise their clean energy supplies at all times of the year. The use of high-voltage direct current (HVDC) transmission, instead of traditional alternating current (AC) lines, allowed far greater efficiency since DC lines had much lower electrical losses over long distances. By 2060, both coal and oil had been virtually eliminated from the global electricity supply.
Practically every new building in the world now utilised solar power in some way – either via rooftop modules, or transparent solar embedded in windows and other surfaces. Given the state of the global climate, fossil fuels had become extremely taboo, with most countries outright banning them for electrical generation. By the late 2070s, the global transition is essentially complete. Even the poorest nations are now achieving zero-carbon electricity, as the last of their natural gas power plants are finally shut down. After more than a century of research, solar cells are nearing their theoretical limit for efficiency – nanotechnology has largely perfected their internal structures. The latest generation of modules is not only cheaper than ever, but can also access a wider part of the electromagnetic spectrum (i.e., beyond visible light, into the infrared), is sturdier and longer lasting, and can be configured to alter its appearance and blend into existing buildings more easily. Having begun to achieve commercial viability in the 2040s, huge clusters of solar power satellites now orbit the Earth and provide 100% coverage to wherever electricity is needed. Onshore and offshore wind projects have continued to expand in recent decades and will soon be powerful enough to change the very air currents of Earth. Hydroelectric has declined in terms of its overall share of the world's electricity, but remains a significant source of power, helped by new developments in wave, tidal, and ocean thermal energy. Meanwhile, nuclear fusion has moved from the laboratory into commercial use in a number of countries, although it remains at a low percentage of global electricity for now – partly due to the high capital investment costs. However, it sees greater usage in the 22nd century as the technology is miniaturised and increasingly used for space-based applications. Advances in space travel and tourism.
By the late 21st century, the frontiers of human space exploration have shifted from the Moon, Mars, and inner Solar System to more distant locations in the outer Solar System. The National Aeronautics and Space Administration (NASA), formed in 1958, remains in operation after 120 years. However, the agency's funding is now negligible compared to that of the private sector, which has turned space into a multi-trillion-dollar industry. The cost of launching people and cargo into space has declined substantially over the last century. From $85,000 per kilogram a hundred years earlier, it is now less than a dollar per kilogram. This means that even people on relatively low incomes have access to Earth orbit – thanks to a new generation of rockets and space planes, alongside the recent development of a space elevator. Basic orbital cruises are therefore taken for granted and regarded in much the same way as a long-haul airplane flight or a visit to a famous landmark – something that most people do every few years. For those with a little extra money, trips around the Moon are available, along with extended stays at a number of private hotels in cislunar space. For the more adventurous, guided excursions on the lunar surface are possible. However, being somewhat more expensive and complicated, these are generally still reserved for those on above-average incomes. In addition, anything longer than a simple tour requires dedication and training, meaning it is often considered a rather extreme leisure activity, akin to the mountain climbing or deep-sea diving of the past. Among the most popular tour destinations are the Apollo landing sites, which have been surrounded by perimeter fencing and made into UNESCO World Heritage Sites. Beyond these "tourist traps" are many areas of geological interest, which are being visited by explorers and researchers keen to claim their place in the history books. Mars had gained a human presence some decades earlier.
The first permanent base was followed by another, then another. By 2078, the population, infrastructure, and self-sufficiency have all grown to such an extent that Mars will soon be ready to declare independence. Water, food, and energy supplies are being handled by increasingly heavy use of robotics and general automation. Originally intended for scientists and other researchers, these stations are now experiencing a steady influx of regular citizens, keen to escape Earth and begin a new life on the Red Planet. In addition to habitats on the Moon and Mars, a large and ever-growing number of mining operations are dotted around near-Earth objects, main belt asteroids and others, with some as far out as the Trojans of Jupiter. These are providing an abundance of previously rare metals and minerals, which has caused a dramatic fall in the price of certain commodities. The operations are almost entirely automated and overseen by powerful artificial intelligences, requiring little or no human input. A network of refuelling stations now provides a ready source of propellant – made from separating water into hydrogen and oxygen, for example – the equivalent of space-based filling stations, enabling longer, cheaper, and faster journeys. With much of the inner Solar System now home to a thriving economy, attention has turned to the untapped potential of the gas giants. For government and commercial interests, by far the most promising candidate is Saturn's largest moon, Titan. The unmanned Huygens probe had landed successfully on this strange world in 2005, returning the first pictures from its surface. The journey had taken almost eight years. Subsequent probes to the outer Solar System took similarly long – New Horizons, for example, launched in 2006 and did not reach Pluto until 2015.
In the years and decades that followed, however, advances in propulsion technology led to reduced travel times – the use of solar sails to generate a small but continuous acceleration, for example, meant that higher speeds could be achieved over time. Other notable innovations included nuclear pulse propulsion and progress with antimatter. Titan's 1.4-billion-kilometre distance was no longer a significant obstacle. These and other developments were now being amplified and optimised by powerful artificial intelligences. By the 2070s, many formerly insurmountable challenges with space travel have been overcome. One of the biggest problems encountered by earlier colonists on the Moon and Mars was how to deal with radiation. On Titan, the issue is eliminated – thanks to the satellite's protective atmosphere, which has a surface pressure about 45% greater than Earth's. Because of this high pressure and lack of radiation, it is actually possible to walk around on the surface without a bulky, airtight space suit, using just an oxygen mask and heavy clothing with embedded heating elements. Buildings can also be constructed and maintained more easily on Titan, with simpler designs that resemble polar bases on Earth. Occupants can take a somewhat more relaxed approach to keeping habitats airtight. Although leaks must be fixed, there is no immediate danger of death. A simple piece of tape can be applied temporarily until a proper repair is completed. For a cyborg or other suitably "upgraded" human, these environmental issues may be even less of a problem. The very high ratio of atmospheric density to surface gravity also greatly reduces the wingspan needed for aircraft to maintain lift – so much so that a human can strap on wings and fly great distances while wearing a lightweight spacesuit.
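The claim about human-powered flight follows from the lift equation: in level flight, lift (½ρv²SC_L) must balance weight (mg), so the wing area needed scales with g/ρ. The sketch below uses real figures for Titan's atmospheric density and gravity, but the flier's mass, airspeed, and lift coefficient are assumptions for illustration.

```python
# Why strap-on wings are plausible on Titan: required wing area scales
# with g / rho. Titan's density (~5.4 kg/m^3) and gravity (~1.35 m/s^2)
# are real figures; the flier's mass, speed and lift coefficient are
# assumed purely for this sketch.

def wing_area(mass_kg, g, rho, speed_ms, cl=1.0):
    """Wing area S (m^2) needed for level flight at a given speed."""
    return (mass_kg * g) / (0.5 * rho * speed_ms ** 2 * cl)

FLIER = 100.0  # kg: person plus suit and wings (assumed)

earth = wing_area(FLIER, 9.81, 1.22, speed_ms=10)
titan = wing_area(FLIER, 1.35, 5.40, speed_ms=10)
print(f"Earth: {earth:.1f} m^2, Titan: {titan:.1f} m^2 "
      f"({earth / titan:.0f}x smaller)")
```

At the same airspeed, the wings on Titan can be roughly thirty times smaller in area than on Earth – or, equivalently, human-scale wings generate far more lift than a person needs.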
Initially, the population of Titan is restricted to scientists and other government/corporate personnel – but like the Moon and Mars, it eventually grows to include regular citizens brave enough to make the voyage and start a new life. During the 22nd century it becomes a major hub for science, research, commerce, and tourism in the outer Solar System.

2079: Asbestos production is ending globally. Asbestos is a group of naturally occurring silicate minerals, all of which form into long, fibrous, crystalline shapes. Viewed at the microscopic level, each fibre is composed of millions of tiny "fibrils" that can be released by abrasion or other processes. Asbestos was first mined in ancient times, but large-scale industrial extraction only began in the 1870s. It was regarded as a revolutionary new material for its physical properties that included resistance to fire, heat, and electricity, along with sound absorption and good tensile strength. Its affordability led to widespread use around the world – in applications such as electrical products, building insulation, fabric and mats, steam engines and turbines. By 1950, over a million tons were being produced globally. However, this rapid growth was accompanied by increasing concerns about health impacts. Researchers were beginning to notice a large number of early deaths and lung problems in mining towns. It was subsequently confirmed that asbestos dust and fibres – too small for the naked eye, and often with a long period between first exposure and the onset of symptoms – were a major public health issue. By the mid-20th century, it was clear that, far from being a beneficial and sustainable material, asbestos was extremely hazardous. The United States government and various industries were criticised for not acting quickly enough. In the late 1970s, court documents proved that asbestos industry officials had known of the dangers since the 1930s but concealed them from the public. The continued long-term use of asbestos after harmful health effects were known or suspected, and the slow emergence of symptoms decades after exposure ceased, made asbestos litigation the longest, most expensive mass tort in U.S. history, involving more than 8,000 defendants and 700,000 claimants. 
Worldwide asbestos production reached a peak of 4.8 million metric tons in 1978, before declining rapidly in the 1980s and 90s. Many countries were now strongly regulating or outright banning its use. After a brief plateau, production fell rapidly again, from 2.1 million tons in 2000 to 1.3 million tons by 2017. Despite the severity of asbestos-related diseases, the material remained in widespread use during the early 21st century. Production was still high in countries such as Brazil, China, India, Indonesia, Kazakhstan, and Russia – fuelled by demand for cheap, mass-produced building materials in much of the developing world and supported by aggressive industry campaigns and lobbyist groups. The World Health Organisation (WHO) estimated that 125 million people were being exposed to asbestos in the workplace every year. More than 60 countries had enacted bans by 2020 – including Australia, Canada, all European Union (EU) members, Japan, and South Korea, alongside a number of nations across Africa, the Middle East, and South America. Although regulations existed in the United States, there was still no official federal ban of the carcinogen. The last domestic producer in the U.S. had ceased operations in 2002, but imports continued to meet manufacturing needs. Nevertheless, consumption in the U.S. was in terminal decline, heading towards eventual obsolescence. More and more nations continued to phase out asbestos, in response to both health and liability issues. Rising incomes in Africa, Asia and elsewhere also meant that substitutes were increasingly viable. Worldwide production was now shrinking to a fraction of its former level. The material's long-term impacts persisted, however. One notable example came in the 2040s, when a spike in cases of lung cancer was recorded in the U.S., and New York in particular.
During the terrorist attacks of September 2001, hundreds of thousands of people had been exposed to asbestos embedded in the collapsing towers' drywall, insulation, fireproofing and steel structures. Decades later, many survivors were now feeling the health effects. While bans successfully covered new buildings and new products, vast amounts of asbestos remained in older buildings and items. Recognising this ongoing danger, countries in both the developed and developing world worked to remove asbestos and eradicate it completely. This had traditionally been a risky and expensive process, but was gradually becoming cheaper and safer through the use of robots. By the late 2070s, the last remaining pockets of asbestos mining have closed – 100 years after the peak in global production and 200 years after the first industrial-scale operations. Health impacts continue in some countries until the end of the 21st century, but are less of a problem than before, as the medical advances of this time can substantially improve survival rates for lung cancer, mesothelioma, asbestosis, and other such illnesses. This once ubiquitous and promising material is finally consigned to history. While asbestos was being phased out, new fears about the health impact of micro- and nano-sized particles were arising as nanotechnology began to emerge and become widespread in the early- to mid-21st century. Graphene, carbon nanotubes and other nanostructures were providing revolutionary breakthroughs in a range of industries, but some observers noted the parallels with asbestos in its early years and viewed these developments with growing unease. Although some of these concerns were valid, they proved to be overstated. Nanotechnology applications were fundamentally different to those of asbestos, with atoms far more tightly bound and less likely to break apart or be inhaled.
Industry regulations and oversight were also vastly improved between the 19th and 21st centuries, enhanced in particular by powerful AI to monitor the slightest defects and report any problems. SQL Server databases are hit by a major glitch. On 6th June 2079, the "smalldatetime" fields in SQL Server databases wrap around to January 1, 1900. Similar to the Year 2038 problem, this is caused by the limited range of values the underlying data type can store, which runs out on this date. There are major errors for any vintage/antique computers still using these systems, though most of the surviving examples are in museums by now. Structured Query Language is exactly a century old, having been commercially introduced in 1979.
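The date of the rollover follows from how smalldatetime is stored: a 16-bit count of days since 1st January 1900 (plus a separate 16-bit count of minutes within the day). The largest unsigned 16-bit value is 65,535, and counting that many days forward from the epoch lands exactly on 6th June 2079 – reproducible with Python's standard library:

```python
# smalldatetime stores its date as a 16-bit day count from 1 Jan 1900,
# so the last representable day is 65,535 days after the epoch.

from datetime import date, timedelta

EPOCH = date(1900, 1, 1)
MAX_DAYS = 2 ** 16 - 1  # 65,535: largest unsigned 16-bit value

last_day = EPOCH + timedelta(days=MAX_DAYS)
print(last_day)  # 2079-06-06

# One day further, a 16-bit counter wraps back to zero -- the epoch.
wrapped = (MAX_DAYS + 1) % 2 ** 16
print(EPOCH + timedelta(days=wrapped))  # 1900-01-01
```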

2080: Some humans are becoming more non-biological than biological. Today, the average citizen has access to a wide array of biotechnology implants and personal medical devices. These include fully artificial organs that never fail, bionic eyes and ears providing Superman-like senses, nanoscale brain interfaces to augment the wearer's intelligence, and synthetic blood and bodily fluids that can filter deadly toxins and provide hours' worth of oxygen in a single breath. Some of the more adventurous citizens are undergoing voluntary amputations to gain prosthetic arms and legs, boosting strength and endurance by orders of magnitude. There is even artificial skin based on nanotechnology, which can be used to give the appearance of natural skin when applied to metallic limbs. These various upgrades have become available in a series of gradual, incremental steps over preceding decades, such that today, they are largely taken for granted. They are now utilised by a wide cross-section of society – with even those in developing countries having access to some of the available upgrades, due to exponential trends in price performance. Were a fully upgraded person of the 2080s to travel back in time a century and be integrated into the population, they would be superior in almost every way imaginable. They could run faster and for longer distances than the greatest athletes of the time; they could survive multiple gunshot wounds; they could cope with some of the most hostile environments on Earth without too much trouble. Intellectually, they would be considered geniuses – thanks to various devices merged directly with their brain. Construction of a transatlantic tunnel is underway. Built using advanced automation and robotics – and controlled by AI – this is among the largest, most ambitious engineering projects ever undertaken. With hyper-fast maglev trains reaching speeds of up to 4,000 mph (6,400 km/h), passengers using the tunnel can travel from Europe to America in under an hour.
Carbon nanotubes, along with powerful geo-sensing devices, have been paramount in the structure's design – these can self-adjust in the event of undersea earthquakes, for example. Also noteworthy is that the train cars operate in a complete vacuum. This eliminates air friction, allowing hypersonic speeds to be reached. The cost of this project is in the region of $88–175 billion. Many former Winter Olympics venues no longer provide snow. Rising temperatures have rendered many former Winter Olympic sites "climatically unreliable" – that is to say, unable to provide snow on a regular basis. Although geoengineering efforts have been underway for some time, these have not yet managed to stabilise the global climate. Former locations that are now either unsuitable or forced to rely on artificial snow include Sochi (Russia), Grenoble (France), Garmisch-Partenkirchen (Germany), Chamonix (France), Vancouver (Canada) and Squaw Valley (US), with a number of others remaining at high risk. Aside from the Olympics, winter sports in general are increasingly being moved indoors, or are taking place in simulated environments. Polar bears face extinction. Between 2000 and 2050, polar bear numbers dropped by 70 percent, due to shrinking ice sheets caused by global warming. By 2080, they have disappeared from Greenland entirely – and from the northern Canadian coast – leaving only dwindling numbers in the interior Arctic Archipelago. For the few which remain, ice breaking up earlier in the year forces them ashore before they have had time to build up sufficient fat stores. Others are forced to swim huge distances, which exhausts them, leading to drowning. The effects of global warming have led to thinner, stressed bears, decreased reproduction, and lower juvenile survival rates. One in five lizard species are extinct. The ongoing mass extinction has claimed many exotic and well-known lizards, with one in five species now gone as a result of global warming.
Lizards are forced to spend more and more time resting and regulating their body temperature, which leaves them unable to spend sufficient time foraging for food. Deadly heatwaves plague Europe. Heatwaves more severe than the one seen in 2003 have become annual occurrences by this time. In the peak of summer, temperatures in major cities such as London and Paris reach over 40°C. In some of the more southerly parts of the continent, temperatures of over 50°C are reported. Thousands are dying of heat exhaustion. Forest mega-fires rage in many places, while prolonged, ongoing droughts are causing many rivers to run permanently dry. Spain, Italy, and the Balkans are turning into desert nations, with climates similar to North Africa.

2083: Hyper-intelligent computers. Due to Moore's Law, $1000 of computing power is now equivalent to a billion Earths' worth of human brains. Laptop-sized computers of today can perform the equivalent of all human thought over the last ten thousand years in less than ten microseconds. Technology is progressing so fast that – in order for people to comprehend it – neural upgrades have become necessary on a regular basis. V Sagittae becomes the brightest star in the night sky. In 2083, a previously faint star system known as V Sagittae erupts in a spectacular nova outburst, becoming the brightest star in the night sky. V Sagittae is a cataclysmic variable binary, located approximately 7,800 light years from Earth. During earlier observations, it was found to consist of a main sequence star of about 3.3 solar masses and a white dwarf of about 0.9 solar masses, orbiting each other every 0.5 days. At such close range, material from the larger star accreted onto the white dwarf at an exponentially increasing rate. From 1890 to 2020, the pair brightened by a factor of 10 and continued to gain in magnitude throughout subsequent decades. Astronomers considered the difference in mass between the two as highly unusual – in all other cataclysmic variables (CVs), the white dwarf was more massive. This made V Sagittae the most extreme of all known CV systems, about 100 times more luminous than normal and with a powerful stellar wind equal to the most massive stars prior to their deaths. Observed to be in the late stages of an in-spiral, the two would ultimately collide and coalesce, creating a powerful burst of light. This occurs in 2083 and results in a tremendous release of gravitational potential energy, driving a stellar wind as never before seen, and raising the system luminosity to near that of a supernova at its peak. This explosive event lasts for more than a month, as the objects merge into one star, during which time V Sagittae outshines even Venus and Sirius.
The merger creates a degenerate white dwarf core, and a hydrogen-burning layer, surrounded by a vast gas envelope consisting mostly of hydrogen – eventually becoming a red giant.
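Just how tightly the pair orbit before merging can be estimated from the figures quoted above – a 3.3 + 0.9 solar-mass binary with a 0.5-day period – using Kepler's third law in solar units (a³ = (M₁+M₂)·P², with a in AU, P in years, and masses in solar masses):

```python
# Kepler's third law applied to the V Sagittae parameters quoted in
# the text, to estimate the binary's separation just before merger.

total_mass = 3.3 + 0.9            # solar masses (from the text)
period_yr = 0.5 / 365.25          # 0.5-day period, in years

# a^3 = (M1 + M2) * P^2  =>  take the cube root
semimajor_au = (total_mass * period_yr ** 2) ** (1 / 3)

AU_KM = 149_597_870.7
SOLAR_RADIUS_KM = 695_700.0

print(f"separation: {semimajor_au:.4f} AU "
      f"= {semimajor_au * AU_KM / SOLAR_RADIUS_KM:.1f} solar radii")
```

The separation comes out at only a few solar radii – close enough for the main sequence star to overflow onto its companion, which is exactly the mass-transfer scenario described above.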

2084: Conventional meat is becoming obsolete. For many thousands of years, humans had practiced animal husbandry. This began during the Neolithic revolution from around 13,000 BC, when animals were first domesticated, marking a transition from hunter-gatherer communities to agriculture and settlement. By the time of early civilisations such as ancient Egypt, various animals were being raised on farms including cattle, sheep, goats, and pigs. The 15th and 16th centuries witnessed the "Columbian Exchange", when Old World crops and livestock were brought to the New World. Other historical developments included the British Agricultural Revolution, or Second Agricultural Revolution, which saw an unprecedented increase in labour and land productivity in Britain between the mid-17th and late 19th centuries, with livestock breeds improved to yield more meat, milk, and wool. The so-called Green Revolution, or Third Agricultural Revolution, occurred in the mid-to-late 20th century. This boosted agricultural production worldwide – particularly in developing regions – through a series of technology transfer initiatives. An expansion of irrigation infrastructure, chemical fertilizers, pesticides, mechanisation, better land management and other modernisation techniques allowed the production of new and higher-yielding varieties of wheat, rice, maize, and other food grains. The resulting increase in harvests led to a major improvement in available food supplies, both for human consumption and as livestock feed. In the early 21st century, however, the global agricultural system faced new and profound challenges. Demand for meat products had soared, led by Asia with its vast populations and rapidly rising incomes. Due to urban expansion and other environmental pressures, arable land was on course to decline from 0.38 hectares per capita in 1970, to a projected 0.15 hectares per capita by 2050.
Topsoil had suffered immense damage, with tens of billions of tons lost to intensive farming each year; one-third of the world's topsoil had been eroded since the Industrial Revolution, and it was forecast to disappear completely by 2075. Fresh water had become increasingly scarce, while nitrogen and other agricultural runoffs polluted the world's rivers, lakes, seas, and oceans. In addition to all this, global reserves of phosphorus, essential for many agricultural systems, appeared to be reaching a peak, with most of the remaining supplies confined to just four countries: Morocco, China, Algeria, and Syria. Another major issue to emerge around this time was antibiotic resistance. In the past, antibiotics were routinely added to certain compound foodstuffs in order to maximise livestock health and growth, but this practice was increasingly frowned upon in many countries due to the risk of drug-resistant bacteria. By 2020, approximately 700,000 people were dying each year from such infections, with 60% of these diseases originating from animals. This figure was on track to reach 10 million per year by 2050, becoming a bigger killer than cancer. Public attitudes towards meat in general were shifting. The cultural zeitgeist was gradually moving away from traditional animal slaughter and in favour of new, alternative ways to produce meat. The climate crisis, resulting in part from the livestock industry (accounting for 15% of greenhouse gas emissions), gave added momentum to the need for change. Novel vegan meat replacements, made entirely from plant-based inputs, emerged as a popular substitute in the 2010s. With sophisticated production techniques making use of haemoglobin and binders to hold ingredients together – extracted via fermentation from plants – they could mimic the sensory experience of meat and even blood. However, an even more realistic alternative was in development: cultured meat, produced by in vitro cultivation of animal cells.
In 2012, Dutch scientists created a rudimentary form of synthetic meat, consisting of thin strips of muscle tissue derived from a cow's stem cells. The following year, they became the first group to produce a complete burger, grown directly from animal cells. The first lab-grown burger had cost $384,000. This was more than just a novelty, however. Other scientists and companies were beginning to research and develop their own versions. Tens of millions of dollars began to flow into this nascent industry. By the early 2020s, cultured meat was commercially available in a number of restaurants and supermarkets around the world. As the technology advanced – not only becoming cheaper and easier to utilise, but also expanding the types of cell tissue available – the number of start-ups began to explode. Meanwhile, larger, and more established companies steadily increased their investments. A variety of cultured meats entered the market – everything from burgers to hot dogs, meatballs, nuggets, sausages, and steaks. Production involved many of the same tissue engineering techniques traditionally used in regenerative medicine, such as inducing stem cells to differentiate, with bioreactors to "grow" the foods, and scaffolds to support the emerging 3D structures. By accumulating cells in a strictly controlled environment, the final products could be manufactured in far cleaner and safer ways, free from any harmful organisms. The introduction of food labels helped consumers to identify synthetic or "clean" meat in shops, as had been done for organic, free range, and certain other classifications in the past. Mainstream adoption occurred in many countries, as these products became cost competitive. By the late 2030s, the worldwide market for conventional slaughtered meat was being overtaken by the combination of cultured meat and novel vegan replacements. 
By the early 2040s, cultured meat alone had achieved market dominance, despite attempts by established farming lobbyists to slow progress. New biotechnology methods continued to disrupt not only the meat industry, but the entire food sector, as a whole range of synthetic products – milk, egg white, gelatine, and fish – could now be created with similar technology. As the decades went by, the consumption of traditional meat became increasingly taboo – morally equivalent to foie gras, or shark fin soup. This was driven in part by animal welfare concerns, but a greater ethical issue was the climate crisis now rapidly worsening and engulfing much of the world. Cultured meat had the potential to drastically reduce environmental impacts, with 95% lower greenhouse gas emissions and requiring 99% less land. Having consolidated into a mature industry across the developed world, cultured meat became more widespread in poorer nations during the 2050s. However, traditional meat still accounted for hundreds of billions of dollars in revenue and remained a significant commodity in some regions, with cultural and societal aspects that went beyond mere economics. Although continuing to shrink globally, it now entered an S-curve of slower decline. The industry would survive for another few decades yet. By the 2080s, the world has changed almost beyond recognition. Biotechnology has swept aside traditional agriculture and its many related infrastructures – rendering vast areas of farmland redundant and providing an opportunity to restore forests, rivers, lakes, and other natural features. Many regions have outright banned the slaughter of animals. Most households with at least a middle income or above have access to a kitchen appliance that can replicate meat products and other foodstuffs within a matter of minutes. Many people today look back in horror at the farming practices of the past, which killed upwards of 65 billion animals per year and had gigantic environmental impacts.
In 2084, only a negligible percentage of the world's people still raise livestock for meat. Androids are widespread in law enforcement. Fully autonomous, mobile robots with human-like features and expressions are deployed in many cities now. These androids are highly intelligent, able to operate in almost any environment and handle a wide variety of duties. As well as their powerful sensory and communication abilities, they have access to bank accounts, tax, travel, shopping, and criminal records, allowing them to instantly identify people on the street. The presence of these machines is freeing up a tremendous amount of time for human officers. They are also being used in crowd control and riot situations. With inhuman strength and speed, a single android can be highly intimidating and easily take on dozens of people if needed. Special controls are embedded in their programming, however, to prevent the use of excessive force.

2085: Five-year survival rates for brain tumours are reaching 100%. Because of their invasive and infiltrative nature in the limited space of the intracranial cavity, brain tumours were once considered a death sentence. Detection usually occurred in advanced stages, when the presence of the tumour had caused unexplained symptoms. Glioblastoma multiforme – the most common and most aggressive malignant primary brain tumour in humans – had a median survival period of only 12 months from diagnosis, even with aggressive radiotherapy, chemotherapy, and surgical excision. In the 21st century, however, detection and treatment methods improved greatly with nanorobotics, gene therapy and technologies able to scan, analyse and run emulations of complete brains in astonishing detail. Alongside this was the gradual emergence of "transhumans", who began utilising permanent implants in their brains and bodies, alerting them to the first signs of danger. Towards the end of this century, five-year survival rates for brain cancer are approaching 100% in many countries, the US being among the first.

2086: Hinkley Point C is decommissioned. Hinkley Point C, constructed between 2018 and 2026, became part of a "nuclear renaissance" in the UK. This power station supplied nearly 10% of the UK's total electricity demand. After 60 years of operation, the aging plant is finally being shut down. By now, fission has been largely supplanted by fusion.

2090: Religion is fading from European culture. In some European nations, the number of people considering themselves to be non-religious has increased from around 30% in 1980, to over 90% now. Although large numbers of Muslims populate the continent, a substantial portion are now only "culturally" Muslim, rather than having a literal interpretation of the Koran. Mainstream Islam has begun a reformation and modernisation in recent years – aided by vast improvements in education, combined with the broad homogenisation of culture resulting from globalisation, the Internet, various international agreements, and other factors. Medical advances are undermining religion as a whole, by greatly diminishing the fear of death, while developments in AI, robotics and biotechnology are beginning to trivialise the miracles on which many ancient religions are based. The increasing presence of androids in society – along with other forms of sentience – is adding a whole new dimension to the way humans view themselves and their place in the Universe. The ability to communicate with certain artificially enhanced animals (such as dolphins, monkeys, and domestic pets) is also contributing to this trend. Spirituality continues to play a role in European cultures – but is now based more on nature and physical reality, rather than myths, dogma, or supernatural forces. The USA still lags far behind Europe in terms of atheistic belief, however. It will be another century before America reaches the same level, even longer for certain parts of Asia. Even then, a small percentage of citizens will continue to worship a God (or Gods), well into the next millennium. These people will tend to be those who reject science and technology or have purposefully chosen to isolate themselves from the rest of the world. Hypersonic vactrains are widespread. Much of the world has now established a hypersonic, evacuated tube transport system connecting major population centres. 
Its routes extend primarily throughout Russia, Northern Europe, Canada and the US. These trains are more advanced versions of the slower, simpler prototypes first introduced decades previously. This form of transport works by combining the principles of maglev trains and pneumatic tubes. The trains, or vactrains as they are called, travel inside a closed tube, levitated and pushed forward by magnetic fields. After passing through an airlock, the train cars enter a complete vacuum inside the tube. With no air friction to slow it down, the vactrain can reach speeds far beyond that of any traditional rail system. The fastest routes can now reach speeds of around 4,000 mph (6,400 km/h) – roughly five times the speed of sound – compared to a 300-mph maglev train a century earlier. With speeds of this magnitude, any city within the network can be reached in just a few hours, even if located on the other side of the planet. A number of new routes are in the planning stages as well, including a system of truly massive transoceanic connections. This is possible thanks in part to the system's relative cheapness (around 10% the cost of high-speed rail), as well as its energy efficiency. Since the train cars simply coast for most of the trip after being accelerated, most of the energy can be regained by the track system when slowing down. The modular design of tubes enables construction to be automated. One of the main issues designers had to contend with was the problem of safety. At such high speeds, even the slightest bump in the track or misalignment could end in disaster. In addition, the sheer size of the tube systems means that engineers have to deal with the movements of tectonic plates – a particular problem when crossing fault lines. In order to deal with this and disasters such as earthquakes, an immense system of gyroscopes and adjusters are maintained along the length of each route.
These are controlled by an automated system of computers receiving constant streams of weather and seismic data, adjusting and bracing the track in real time. Leaks into the vacuum are managed through a combination of self-healing materials and redundant plating. The late 21st century is a bleak, fragile time for humanity, with much rebuilding to do. However, the resurgence of international travel (following a collapse in earlier decades) is contributing once more to a homogenisation between stable countries, with ease of transport bringing the world closer together. One particular benefit is the rapid movement and resettlement of refugees affected by climate-related disasters.
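As a rough sanity check on the journey times quoted above, a trapezoidal speed profile (accelerate, cruise, brake) gives travel time from the 6,400 km/h cruise speed. The acceleration limit of 1 m/s² (about 0.1 g, tolerable for passengers) is an assumed figure, not from the text; this is a minimal sketch, not a model of any real system.

```python
def trip_time(distance_km, cruise_kmh=6400.0, accel_ms2=1.0):
    """Door-to-door time in hours for a vactrain trip with a trapezoidal speed profile."""
    v = cruise_kmh / 3.6              # cruise speed in m/s
    d = distance_km * 1000.0          # trip distance in metres
    d_ramp = v**2 / accel_ms2         # distance spent accelerating plus braking
    if d <= d_ramp:                   # short hop: never reaches cruise speed
        return 2 * (d / accel_ms2) ** 0.5 / 3600.0
    t_ramp = 2 * v / accel_ms2        # time spent accelerating plus braking
    t_cruise = (d - d_ramp) / v       # time at cruise speed
    return (t_ramp + t_cruise) / 3600.0

# An antipodal trip of ~20,000 km comes out at roughly 3.6 hours,
# consistent with "just a few hours" for the other side of the planet.
print(f"{trip_time(20000):.1f} h")
```

Most of an antipodal trip is spent coasting at full speed; the two half-hour ramp phases at each end are what the regenerative braking described above recovers.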

2091: SpaceX's "Starman" has a close encounter with Earth. On 6th February 2018, the U.S. aerospace company SpaceX conducted the maiden launch of the Falcon Heavy, a partially reusable heavy-lift launch vehicle designed to transport people and cargo beyond Earth orbit. The rocket carried a Tesla Roadster, which belonged to SpaceX founder Elon Musk, as a dummy payload. Inside the electric sports car was a mannequin, nicknamed Starman, who appeared to "drive" the car while wearing a spacesuit. The Roadster, released upon reaching outer space, had sufficient velocity to escape Earth's gravity. It entered an elliptical heliocentric orbit that would eventually cross the orbit of Mars, reaching a maximum distance from the Sun at aphelion of 1.66 AU. In 2091, after a period of 73 years, Starman returns to cislunar space and makes a close approach to Earth. He continues to circle the inner Solar System for millions of years into the future, with a small probability of colliding with Earth (6%) or Venus (2.5%) during the first million years.
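Starman's 73-year gap between Earth encounters is the result of many repetitions of a much shorter orbital period, which Kepler's third law gives directly from the orbit's size. The aphelion of 1.66 AU is from the text; the perihelion of roughly 0.99 AU is an assumed value for illustration.

```python
def orbital_period_years(perihelion_au, aphelion_au):
    """Kepler's third law for a heliocentric orbit: T^2 = a^3 (T in years, a in AU)."""
    a = (perihelion_au + aphelion_au) / 2.0   # semi-major axis of the ellipse
    return a ** 1.5

# Aphelion of 1.66 AU is from the text; perihelion near 0.99 AU is assumed here.
T = orbital_period_years(0.99, 1.66)
print(f"Period = {T:.2f} years")   # roughly one and a half years per circuit of the Sun
```

With a period of about 1.5 years, the Roadster completes nearly fifty circuits of the Sun before its orbit and Earth's line up closely again in 2091.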

2092: West Antarctica is among the fastest developing regions in the world. As a result of global warming, temperatures at the poles have risen more than anywhere else in the world – meaning that parts of Western Antarctica are now comparable with the climates of Alaska, Iceland, and northern Scandinavia. In some areas, melting of surface ice has resulted in conditions appropriate for large-scale human settlement. The icy continent today would be unrecognisable to observers from the 20th century: its northern peninsula is now home to a multitude of towns and conurbations, with a total population numbering in the millions. Even farming and crop growing are now possible in some of the most northerly areas, using genetic modification techniques. Rapid immigration from countries all over the world has created a diverse mixture of people and cultures flocking to this new land of opportunity. In a way, the settlement of Antarctica is similar to that of America in the 18th and 19th centuries. The highest density cities are becoming cultural melting pots similar to London and New York, though on a smaller scale.

2095: Hurricane defence fields are operational in the U.S. Compared to other forms of energy production, offshore wind power emerged relatively late in the modern era. Although land-based turbines had been around for centuries in some form, their offshore equivalents did not arrive until the end of the 20th century. The first commercial offshore wind farm was constructed in Denmark in 1991 and consisted of 11 units generating less than five megawatts (MW). The project was viewed with scepticism at the time, but the experience it provided led to cheaper and more efficient designs. Despite some downsides, offshore wind proved to have the major advantage of capturing much stronger and more persistent winds at sea. It could also be sited close to load centres along coasts, such as large cities, thus eliminating the need for new long-distance transmission lines. Furthermore, NIMBY opposition to construction was usually much weaker. Following the first installation in Denmark, more projects began to spring up around Europe in the 1990s. However, it remained a niche industry and would not take off until the early 21st century. The UK opened its first offshore wind farm in 2000, which had a modest capacity of just 4MW, but then rapidly developed much larger projects, overtaking Denmark to become the world leader by 2008. It also became home to the single largest project in the world – the 175-turbine London Array – completed in 2013. Europe as a whole boasted 11GW of capacity by 2015, more than 90% of the global total. America lagged far behind Europe. A small, experimental floating wind turbine – the 20-kilowatt VolturnUS – was demonstrated in the Penobscot River, Maine, during 2013. The nation's first commercial offshore wind farm – the 30MW Block Island project off Rhode Island – finally began operation in 2016. These small beginnings were just a precursor, however, to a period of tremendous expansion.
A number of states were setting highly ambitious targets for the years and decades ahead. The state of New York established a goal of 2,400MW to be completed by 2030. The neighbouring state of New Jersey aimed for 3,500MW by that same date. In 2016, an update to Massachusetts energy law committed the state to purchasing 1,600MW of offshore wind by 2027. Other states followed, launching plans for development around the Gulf Coast and Pacific west coast, as well as the Great Lakes region. Some 30 offshore wind farms totalling 24GW of capacity were now in planning. Offshore wind, like solar power, was poised to become one of the exponential growth industries of the 21st century. The U.S. Department of Energy estimated that the country had a gross resource potential of 10,800GW and a "technical" resource potential of 2,058GW – enough to power the whole of the United States. More and more projects were getting underway as developers realised the economic, social, and environmental benefits of this abundant form of energy. A setback occurred when the U.S. Navy placed restrictions on large areas of Pacific coastline – including all of San Diego and Los Angeles, extending up to the Central Coast. While this did not completely stymie the offshore wind industry in that part of America, it made the North and South Atlantic states a generally more attractive bet for investors. These places had much larger and easier to access wind resources in any case. Improvements in technology and cost, alongside increasing concerns about climate change, led to renewables playing a greater and greater role in the energy mix of the United States. By 2050, the nation was generating more than 86GW of electricity from offshore wind. The South Atlantic states, continuing the trend from earlier years, experienced particularly strong growth. These had the largest technical resource potential of any coastal region: nearly 30% of the U.S. total. 
With fossil fuels now rapidly heading for obsolescence – and next-generation turbines being deployed in enormous numbers – the second half of the 21st century would provide even greater opportunities for expansion. Advances in wind technology continued to reduce the cost, while boosting the efficiency, scalability, and durability of turbines. Powerful artificial intelligences were now emerging to handle the planning, construction, operation, and maintenance of wind farms, using vast teams of robots, drones, and other automated systems. Among the most promising breakthroughs were floating platforms, which allowed turbines to be sited further out to sea (rather than needing part of their structure to be anchored in the sea floor). This had been a niche sector throughout the 2020s but became cost-competitive during the 2030s and grew markedly from that decade onwards. Another design that became popular was ice-resistant floating technology. While not particularly useful in the Atlantic, Gulf of Mexico or Pacific, it helped to increase the resource potential of the Great Lakes region, making projects there more adaptable and sturdier. New super-strong but lightweight materials were enabling wind turbines to reach great sizes. From a few tens of metres in the 1990s, they increased to heights of 260 metres (853 feet) by 2021 and then grew even taller over subsequent decades. This was accompanied by an order of magnitude boost in their capacities, allowing tens of megawatts for a single turbine. During the second half of the century, wind farms established in the Atlantic became so large and widespread that some began to merge and combine to form gigantic networks spanning hundreds of miles. What had seemed like "megaprojects" back in 2050 were being dwarfed in scale by the new proposals organised by super-intelligent AI, which had an increasingly strong pull on the reins of the economy. 
Initially serving in a mere advisory role, these sentient programs were gaining ever more control in terms of energy and resource utilisation. Their capabilities were becoming strong enough to reshape not just the political and economic sphere, but the physical and natural systems of Earth itself. Recognising the existential threat of climate change, the AIs began to implement one of their most important long-term plans. By 2100, the South Atlantic states are generating over 300 gigawatts of offshore wind power – fully half of their technical resource potential. In addition to providing abundant clean energy, these vast networks now serve an incredibly important extra function: the sheer scale of wind farms now spread across the coasts is sufficient to interact with and alter the strength of approaching hurricanes. After using satellites to analyse wind patterns, thousands of floating platforms are manoeuvred by AI to their optimal positions. The energy extracted by these wind turbines, when fully optimised, can diminish a hurricane's peak wind speed by over 90 mph (145 km/h) – enough to reduce it from category 5 to category 1 or less – and reduce storm surges by almost 80%. These hurricane defence fields are not infallible and many other impacts of climate change remain to be solved. Nevertheless, the ability to partially control an aspect of the world's weather system is hailed as a major milestone in the fight against global warming. From a historical perspective, it represents a significant step towards humanity becoming a Type 1 civilisation on the Kardashev scale. Other regions are taking similar measures too – such as the typhoon-hit nations of Southeast Asia. Global fertility has stabilised at below 2.0 children per woman. By 2095, the global average number of children per woman has dropped below two, with even Africa now approaching this level after 130 years of declining fertility.
There is now a remarkably similar rate between all regions on Earth – due to a range of factors including better education, improvements in health and living standards, access to contraception and shifting cultural perceptions on the value of children, ideal family size, etc. Many of the world's languages are no longer in use. Increased globalisation has resulted in the number of human languages declining from around 7,000 during the late 20th century, to less than half of that figure now. Many old sayings, customs and traditions have been abandoned or forgotten as the world becomes an ever smaller and more interconnected place. Changing social and economic conditions have forced many parents to teach their children the lingua franca, rather than obscure local dialects, in order to give them a better future. This is especially true in Africa and Asia. This broad homogenisation of culture has been further propagated by the stunning advances in technology which have swept the world. Many people in developed countries, for instance, are eschewing their native tongues altogether, relying on brain implants for everyday communications. The young especially are utilising this form of digital telepathy, now sufficiently advanced that verbally speaking has almost become an inconvenience, due to the longer time intervals required in conversations. Meanwhile, tribespeople and isolated communities have lost homelands due to climate change, deforestation and shifting land uses. This forced migration and assimilation into the wider world has caused many ancient and rural languages to fade away. English, Mandarin and Spanish remain the lingua francas of international business, science, technology, and aviation.
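The category reduction described in the hurricane entry above can be checked against the standard Saffir-Simpson wind scale, sketched here with its usual thresholds in mph for sustained winds:

```python
def saffir_simpson(wind_mph):
    """Saffir-Simpson hurricane category for a sustained wind speed (0 = below hurricane strength)."""
    thresholds = [(157, 5), (130, 4), (111, 3), (96, 2), (74, 1)]
    for limit, category in thresholds:
        if wind_mph >= limit:
            return category
    return 0

# A 160 mph category-5 storm, weakened by the ~90 mph the text attributes
# to a fully optimised wind-farm array:
before, after = 160, 160 - 90
print(saffir_simpson(before), saffir_simpson(after))   # -> 5 0
```

A 90 mph reduction takes a borderline category-5 storm below the 74 mph hurricane threshold entirely, which is why the text describes the result as "category 1 or less".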

2099: Sea levels are wreaking havoc around the world. Despite efforts to halt climate change, action came too late to save many lowland areas of the world. Sea levels rose almost two metres by the late 2090s, displacing hundreds of millions of people. The Maldives were especially hard hit, with most of the nation disappearing underwater. Countries around the globe were forced to begin large-scale evacuation and resettlement programmes, while trillions of dollars were spent on coastal defences. Over 80% of the Amazon rainforest has been lost. Due to the combined impacts of logging, drought, forest fires, desertification, agriculture, and industrial expansion, less than one-fifth of the Amazon now remains. In addition to mass extinctions of flora and fauna, many indigenous peoples' communities have vanished. All newborns have a median artery of the forearm. By the end of this century, micro-evolution of the human forearm has resulted in all newborns now having a persistent median artery. This biological anomaly had once been considered an embryonic structure, normally regressing by the 8th week of gestation. In the womb, it functioned as the main vessel supplying blood to the forearm and hand but would disappear once the radial and ulnar arteries developed in parallel on either side of it. However, an increasing prevalence in adults began to emerge from the 18th century onwards. More and more people retained all three of the arteries throughout their lives. By the end of the 19th century, more than 10% of newborns had the feature and this had grown to 30% by the year 2000. A study of cadavers in 2020 showed the phenomenon continuing, now observed in 35% of the population. Scientists attributed this micro-evolution to mutations of genes involved in median artery development, or health problems in mothers during pregnancy, or both. Furthermore, studies revealed humans to be evolving generally at a faster rate than at any point in the past 250 years.
Other changes in anatomy seen over a relatively short time included the increasing absence of wisdom teeth (caused by modern diets and a narrowing of faces), abnormal connections of two or more bones in feet, and the growing prevalence of spina bifida occulta (a small gap in one or more vertebrae). By 2099, the persistent median artery is no longer considered a variant, but a "normal" human structure. The extra blood vessel poses no health risk and actually provides a small benefit by increasing overall blood supply. It can also be used as a replacement in surgical procedures for other parts of the body. The average employee works less than 20 hours per week. In the early 1800s, most people in America and Europe worked shifts of 70-90 hours per week, or even longer during busy periods. Conditions were often cramped and dangerous, pay was low (or non-existent in the case of slaves) and workers had little in the way of rights. Later in the 19th century, a growing labour movement led to improved workplace regulations. A series of laws were introduced to improve safety and to limit the hours worked by employees – particularly women and children – while slavery was abolished throughout much of the world. Further progress was made in the 20th century with the introduction of minimum wage laws, the emergence of five-day workweeks and continued growth in union membership. In 1900, a typical U.S. citizen worked 57 hours per week, and this had fallen to 49 hours by the 1920s. Working hours fell sharply during the Great Depression, especially in manufacturing, but rebounded during World War II. In the post-war boom years, the average length of workweek in the U.S. continued to fall, but at a slower rate than before, hovering at around 40 hours. There was a faster decline in Western Europe, however, and in the OECD as a whole. On both sides of the Atlantic, union membership declined substantially in the 1970s and 80s, though progress continued to be made in employment law. 
In the 21st century, these trends continued. In the first few decades, this occurred alongside further changes in the workforce, with telecommuting and flexible business hours making work in developed nations more dependent on worker preference and efficient productivity. In addition, this period witnessed the gradual loss of traditional labour as computer intelligence and automation proliferated. This led to serious disruption as employers attempted to adapt to the growing surplus of workers. The growth of personal manufacturing in the form of 3D printers and nanofabricators, alongside increasingly common means of local power generation, began to significantly alter the economy itself. As production became more and more decentralised, work became less and less of a requirement for basic living. Spending on necessities like food at home, cars, clothing, household furnishings and utilities – as a share of disposable income – had already declined from 55% in 1950, to under 35% by the 2010s. With such items becoming producible on a personal and community scale, work hours in many places were gradually becoming a matter of choice rather than need. This revolution in manufacturing, combined with exponential growth of computer intelligence, would eventually change the nature of work itself. With an ever-growing share of the economy based on information technology, the average job was becoming more creative, personal, and intellectual. By the latter half of the 21st century, artificial general intelligence had penetrated much of the business world, allowing workers to share tasks with computers able to operate with little or no human intervention. Finally, a gradual cultural shift – in which more value was placed on free time and creative pursuits rather than work or material gain – began to emerge during the last decades of the century.
This grew largely out of the global response to climate change, but was also a consequence of technological advancement and the mounting costs of unchecked materialism. While by no means a rapid or ubiquitous trend, this also helped to reduce the need for traditional working jobs. All of these factors contributed to an ongoing net reduction in global working hours. By the 2040s, the average workweek had fallen below 30 hours. This trend continued, falling below 20 hours during the closing years of the 21st century.
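For a sense of how gradual this trend actually is, the fall from a 57-hour workweek in 1900 to just under 20 hours by 2099 corresponds to a compound decline of only about half a percent per year. A rough illustrative calculation, assuming a constant annual rate (the real decline was uneven, as the entry describes):

```python
def annual_decline_rate(start_hours, end_hours, years):
    """Constant compound annual rate taking the workweek from start_hours to end_hours."""
    return (end_hours / start_hours) ** (1.0 / years) - 1.0

# Figures from the text: 57 hours in 1900, just under 20 hours by 2099.
rate = annual_decline_rate(57, 20, 199)
print(f"{rate:.2%} per year")   # about -0.52% per year, compounding
```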

2100: Human intelligence is being vastly amplified by AI. Ubiquitous, large-scale automation has led to vast swathes of human employees being replaced by virtual or robotic counterparts. Strong AI now occupies almost every level of business, government, the military, manufacturing, and service sectors. Rather than being separate entities, these AIs are often merged with human minds, greatly extending the latter's capability. For instance, knowledge and skills can now be downloaded and stored directly within the brain. As well as basic information and data, this includes many physical abilities. A person can learn self-defence, for example, become an expert in any sport, or be taught to operate a new vehicle, all within a matter of seconds. The world has been transformed by this fusion of people and machines. The vastly greater power of AI means that it has become, simultaneously, both master and servant to the human race. The benefits of this human-AI merger require the extensive use of implants, however – something which a significant segment of the population still refuses to accept. Compared to transhumans, these non-upgraded humans are becoming like cavemen – thousands of years behind in intellectual development. Unable to comprehend the latest technology, the world around them appears "fast" and "strange" from their increasingly limited perspective. This is creating a major division in society. Nomadic floating cities are roaming the oceans. At the dawn of the 22nd century, many of the world's cities lie partially submerged due to rising sea levels. Despite some attempts to build flood defences, even famous locations – such as New York, London, Hong Kong, Shanghai and Sydney – have been affected. With over 10% of the world's population living on coastlines, hundreds of millions have been forced to migrate. While many citizens have abandoned their homelands, a growing number have adopted a new means of living which does away with national boundaries altogether.
This comes in the form of floating, artificial islands – entirely self-sufficient and able to cruise around the world indefinitely. These ships provide comfort, safety and security, in stark contrast to the upheaval and chaos experienced by many land dwellers. In addition to a continuous supply of food and freshwater, various facilities are available including virtual reality suites, state-of-the-art android servants/companions, swimming pools, landing pads for anti-grav vehicles and much more. Carefully maintained arboretums with real trees can also be found on board (flora which is becoming increasingly rare these days). These giant, amphibious ships are especially popular in Southeast Asia, which has been hit hard by the effects of climate change. Some of the largest craft house upwards of 100,000 residents. Whole new cultures are forming on these "micro nations" – often based around particular themes, lifestyles, ethics, or belief systems that appeal to a specific demographic. Seasteading in general has exploded in recent decades. In addition to city ships, permanent settlements have appeared along the flooded coasts of many regions. This often takes the form of recovered infrastructure rebuilt to accommodate rising sea levels. In the more prosperous nations, cities may be rebuilt using massive, anchored pontoons or other hydrostatic devices. More commonly, entirely new cities are devised by governments to accommodate the displaced populations of coastal cities. New, larger, and more advanced versions of the Energy Islands built in earlier decades make up the majority of these settlements. Some consist of huge artificial archipelagos, stretching for tens of kilometres. Units are often covered in natural plant life, in addition to hi-tech systems for carbon sequestration. As well as CO2 capture, offshore settlements play a role in scrubbing general air and water pollution, acting as giant filters that remove trash and chemicals from the ocean.
These materials can then be recycled and put to new use. This is now having a significant impact in reversing the enormous damage that has accumulated over the centuries from ocean acidification, plastic debris, nitrogen, and other man-made waste products. Needless to say, these settlements, both stationary and roaming, are entirely carbon neutral. Power is produced from a combination of OTEC plants, offshore wind farms, tidal and wave plants, solar arrays, and other means. Some even utilise fusion. Food is grown and water desalinated locally. These ocean settlements are themselves among the earliest adopters of the so-called "post-growth economy". This had emerged out of the converged crises of resource depletion and advanced automation that began during the mid-late 21st century and seeks to minimise the impact of human economic activity on the environment. Emperor Penguins face extinction. For centuries, Emperor Penguins were the best-loved and most recognised symbol of Antarctica. By the early 22nd century, their numbers have dwindled to almost nothing because of melting sea ice, depletion of krill and industrial activity. Small populations continue to exist, by adapting their breeding habits, but even these will eventually disappear. Terraforming of Mars is underway. With space travel becoming low cost (now just a few cents per kilogram of payload), and journeys between planets now relatively routine, serious plans are underway for the transformation of Mars, with the ultimate goal of making it habitable for humans. Exactly who should be given control of Mars and its resources – or if the planet should have independence – is the subject of much debate and speculation around this time.

2110: Room-temperature superconductors are in widespread use. By the early 22nd century, room-temperature superconductors are embedded in myriad applications and have transformed much of the world's infrastructure and road networks. Just some of the revolutionary advancements include lossless energy transfer, better containment of fusion energy, improved imaging for medical scans, and a variety of new hovering or flying vehicles that can glide effortlessly over the ground. The discovery of superconductivity in 1911 revealed a set of physical properties observed in certain materials where electrical resistance vanishes at close to absolute zero. A further breakthrough in 1933 was the discovery of the Meissner effect – the expulsion of magnetic field lines from the interior of a superconductor as it transitions into the superconducting state. This effect underlies the famous demonstration in which a ceramic superconductor, cooled by liquid nitrogen to below −196°C (−321°F), levitates a magnet. Initially, scientists knew of only a few metals with vanishing electrical resistance at just above absolute zero, or −273°C (−460°F). In the 1980s, however, researchers discovered ceramic materials displaying this phenomenon above 35 K (−238°C, or –397°F). Further progress with ceramics in the 1990s demonstrated critical temperatures reaching above 150 K (−123°C, or –190°F), a substantial jump. In the early 21st century, incremental improvements occurred with various other materials, but all required tremendously high pressures comparable to the conditions in Earth's outer core. Researchers finally achieved the "holy grail" of room-temperature superconductivity in 2020, with a compound at 15°C (59°F) using a diamond anvil cell at 267 gigapascals (GPa). In subsequent years and decades, research teams shifted their focus away from higher temperatures and onto efforts to reduce the immense pressures required for superconductivity. New techniques emerged for scaling up materials – from the nanoscale to the microscale and larger.
Eventually it became possible to combine a room temperature regime with materials both visible to the naked eye and stable at relatively low pressures. Later in the 21st century, some of the world's most powerful artificial intelligences made further discoveries, with even lower pressures. Ultimately, these stable states matched the Earth's atmosphere at sea level. The next critical step involved the perfection of mass production methods for these new compounds, via the ultra-precise arrangement of nanotechnology. A shift from the laboratory and into practical applications then occurred – once again managed and deployed by AI in the most efficient ways possible. In factories and other facilities, 3D printing enabled these superconductors to coalesce in a blur of speed; one of the Singularity-like effects to be witnessed during this time. Following the discovery of superconductivity and the Meissner effect, it took a century for the first room-temperature superconductor to emerge. Now, after a further hundred years of research and development, the practical applications are clear to see. In 2110, the world is being transformed by new devices and components able to function without electrical resistance and with expulsion of magnetic field lines at room temperatures. In a city of today, it is common to witness floating cars, pods and other vehicles gliding smoothly through the air. These float over a cushion of magnetism and are powered by wireless energy transmitted from pads embedded in the ground. Outside a building, you might come across the surreal sight of a parked vehicle, hanging stationary in the air. Even the building itself may incorporate structures, signs or architectural elements that appear to have nothing below them. These hovering vehicles have a number of advantages over traditional wheeled transport. By adjusting their altitude when near pedestrians, they can simply drift above them – eliminating the possibility of accidents. 
This also reduces the incidence of roadkill, which had been responsible for millions of animal deaths per day during the 20th and 21st centuries. The lack of surface contact also eliminates the problem of tyre wear and therefore reduces both air and microplastic pollution produced from vehicles. Although wheels are still common in transport, they are rapidly being supplanted by superconducting technology, as these benefits are increasingly recognised by city authorities and the required infrastructure is expanded. Some of the wealthier and more hi-tech districts have already upgraded their entire road networks to cater for levitating vehicles. As more and more routes become available, being able to travel in three dimensions rather than two enables faster journey times. Combined with AI for traffic management, congestion is virtually eliminated. Abundant energy is available for these autonomous flying vehicles, with 100% of the world's electricity now supplied by ultra-efficient clean tech, and multiple redundancies are built in to ensure they stay aloft. Room-temperature superconductors are transforming numerous other areas. Lossless power transmission is now possible, for example – making obsolete the traditional infrastructure for converting between low- and high-voltage AC and enabling perfect transmission over huge distances. Energy storage is being revolutionised too as battery degradation is no longer a problem, with superconducting coils instead capturing and storing electricity indefinitely. Computers, tablets, and other electronics can be made to run cooler, more efficiently, and with far less energy consumption. Other developments include super-powerful and ultra-compact motors, along with machines that once required entire buildings or rooms to operate being viable on much smaller scales. Compact nuclear fusion is now emerging, for example, which is especially useful in space travel.
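The appeal of lossless transmission mentioned above comes from eliminating the resistive (I²R) heating of conventional lines. A rough sketch with assumed, purely illustrative figures for a long high-voltage DC link:

```python
def line_loss_mw(power_mw, voltage_kv, resistance_ohm):
    """Resistive (I^2 * R) loss for a DC transmission line carrying a given power."""
    current_a = (power_mw * 1e6) / (voltage_kv * 1e3)   # I = P / V
    return current_a**2 * resistance_ohm / 1e6          # convert watts back to MW

# Assumed figures: 1,000 MW sent 2,000 km at 800 kV over a line with
# 0.01 ohm per km of resistance (about 20 ohms end to end).
loss = line_loss_mw(1000, 800, 0.01 * 2000)
print(f"{loss:.1f} MW lost ({loss / 1000:.1%})")   # ~31 MW, roughly 3% of the power sent
# A superconducting line has zero resistance: the same call with R = 0 loses nothing.
```

Even a few percent lost per long-distance link adds up across a continental grid, which is why zero-resistance conductors make the old step-up/step-down voltage infrastructure unnecessary.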
Large-scale science facilities such as particle accelerators now require less energy and lower capital costs, while high-end medical imaging is more efficient and available in smaller form factors. Personal health pods are common in homes. In the early 22nd century, many functions previously performed in clinical settings can be automated and supplied to patients at home. Full body scanners providing a wide range of diagnoses and treatments are now a common household appliance, relieving the burden on hospitals. These devices come in a variety of form factors, but typically consist of a cylindrical capsule about two metres in length. The occupant either stands (in the case of vertical models) or lies down (in a horizontal configuration) for the procedure, which takes a matter of seconds. Cameras with sub-nanometre precision obtain images at trillions of frames per second, panning from head to toe while tracking and adjusting for even the slightest movement. Every region of the body undergoes real-time 3D analysis and is "pinged" for any high-risk changes or abnormalities since the previous scan, to determine spots that need further attention. A summary is then provided to the user, ranked in order of severity. For simple or benign problems, the machine can recommend a drug or other medication. For issues requiring surgery, treatment can be provided by robotic arms/tools, lasers, or nanorobots injected and then guided via a combination of magnets and their own tiny motors. For a transhuman individual, who may already have extensive implants and upgrades, many of these remedies may be unnecessary. While the medical capabilities of 2110 are vastly improved when compared to a hundred years previously, not every aspect of biology is fully understood yet. Certain rare and unusual conditions, for example, continue to persist in the population and require more specialist intervention than these home-based machines can provide.
For the most part, however, treatment of once life-threatening illnesses is now relatively routine. In subsequent decades, a further proliferation of these health pods in tandem with new advances in science leads to cancer mortality being largely eliminated in many countries. Force fields are in military use. A combination of several unique technologies, stacked together in layers, has led to a radical new form of protective shielding. To observers from the previous century, this would resemble the "force fields" depicted in science fiction movies. When activated, it provides an instant, near-impenetrable field, withstanding hits from all but the most powerful weaponry. The outer layer consists of a supercharged plasma window, shaped into a dome or sphere by electromagnetic fields. This is hot enough to vaporise most incoming metals. A secondary layer underneath contains millions of curved laser beams, producing a high-energy web that captures projectiles fast or powerful enough to bypass the plasma window. A third layer consists of a "lattice" made from trillions of carbon nanotubes. These microscopic structures are woven together in an instant, forming a diamond-hard shell repelling objects missed by the other two layers. If necessary, this can be extended to cover a larger perimeter, at the cost of decreased strength. Conversely, it can be reduced in size to provide an even denser and more durable barrier. The layers described above can protect against the majority of bullets, bombs, and projectiles. However, they are almost useless against lasers. A fourth and final layer takes care of this problem. This uses photochromic particles, which change their properties when exposed to laser light, effectively neutralising most directed-energy weapons. An early form of this technology was seen a century previously, with sunglasses that changed colour when exposed to sunlight.
In addition to warzones, these multilayered force fields are used in a range of other situations. National borders, for example, are more secure – as are many sources of food and water production. Corporate spaces and luxury dwellings owned by the rich are also utilising them. A number of satellites are being fitted with this technology too. Large-scale arcologies are emerging as an alternative to traditional cities. The global convergence of environmental issues and resource depletion has forced humanity to drastically rethink the way urban areas are designed. The refugee crisis that emerged in the mid-21st century has now largely subsided, with much of civilization having been relocated to the polar regions of Northern Europe, Russia, Canada, and Western Antarctica. In order to accommodate so many people in such a small area, cities have become increasingly dense and self-contained. However, decades of concerted geoengineering efforts have led to success in stabilising global temperatures. Combined with ongoing population pressures, this has prompted governments to begin repopulating some of the abandoned regions in more central latitudes. Despite this progress, most countries still face the problem of resettling hyper-arid, ecologically ravaged environments. As such, long-hypothesised "arcologies" have begun to emerge as a radical departure from traditional urbanism, condensing an entire city into one massive structure. A precedent for these mega-structures could be seen as far back as the 2020s, with construction of the first centrally planned, truly sustainable cities. Later in the 21st century, these principles were adapted for the development of single structures – resulting in supertall skyscrapers that combined vertical farming with residential and commercial space, recycling and production systems for energy, water, and other resources.
By the 22nd century, these towers have evolved into some of the mightiest structures ever built – of such immense volume that some cover several kilometres in girth. They typically rise over 1.5 kilometres in height and accommodate millions of people. Some are partially or fully merged into mountainsides and other landscapes – resembling enormous ant colonies, and living up to their portmanteau of "architecture" and "ecology". This scale of engineering has been made possible through advances in materials science, with carbon nanotubes utilised to cope with the massive forces involved. The sheer size and strength of arcologies makes them virtually immune to earthquakes, hurricanes, and other disasters. Each of these self-contained structures holds everything it needs for human survival. Automation is ubiquitous, with intelligent robots managing almost all construction and maintenance. Highly efficient transport systems are located throughout to move travellers horizontally, vertically or diagonally. Advancements in elevator technology have made lifts capable of whisking riders up in a single trip – no matter what height – as opposed to changing lifts halfway up. This has been accomplished through improved cable design and, more recently, the use of electromagnetic propulsion. This kind of hyper-dense urban environment allows movement around a city at speeds unheard of in previous centuries. These radical new designs exemplify an overall trend in recent human development: low environmental impact. Globally, cities and their connecting infrastructure are slowly being retracted, giving over more land to nature. Advances in transportation and civil engineering, combined with nano-scale manufacturing, are enabling humans to operate with little or no impact on the environment. Though classically designed cities still exist, the arcology represents a fundamental shift in the balance between humans and nature. Femtoengineering is practical.
Technology on the scale of quadrillionths of a metre (10⁻¹⁵ m) has recently emerged. This is three orders of magnitude smaller than pico technology and six orders of magnitude smaller than nanotechnology. Engineering at this scale involves working directly with the finest known structures of matter – such as quarks and strings – to manipulate the properties of atoms. This development is a further step towards macro-scale teleportation, i.e., transportation of objects visible to the naked eye. Significant breakthroughs in anti-gravity and force field generation will also result from this. Another area that will see major progress is in materials technology. For example, metals will be produced which are capable of withstanding truly enormous pressures and tensile forces. The applications for this will be endless, but perhaps one of the most exciting areas will be in the exploration of hostile environments – such as probes capable of travelling within the Sun itself, and tunnelling machines that can penetrate the Earth's crust into the layers of magma beneath. Longer term, this development will pave the way for interstellar ships and the massive forces involved in lightspeed travel. Other more exotic materials are becoming possible – including wholly transparent metals, highly luminous metals, frictionless surfaces, and ultra-dense but extremely lightweight structures. As with many areas of science, femtoengineering is being guided by advanced AI, which is now trillions of times more powerful than unaided human intelligence. Man-made control of earthquakes and tsunamis. By now, geophysicists have mapped the entirety of the Earth's crust and its faults, extending some 50 km (30 mi) below the surface. Computer simulations can forecast exactly when and where an earthquake will occur and its precise magnitude. With a "scheduling" system now in place, comprehensive preventative measures can be taken against these disasters.
For instance, people know when to stay out of the weakest buildings, away from the bridges most likely to collapse and otherwise away from anything that might harm them. Rescue and repair workers can be on duty, with vacations cancelled and extra workers brought in from other areas. Workers can be geared up with extra equipment ordered in advance to fix key structures that may fail in an earthquake. Freeways can be emptied. Dangerous chemical freight can be prevented from passing through populated areas during the quake. Aircraft can be stopped from approaching a potentially damaged runway. Weak water reservoirs can have their water levels lowered in advance. Tourists can be made to stay away. All of these measures can substantially reduce casualties and economic disruption. However, some nations are going one step further and creating additional systems, in the form of gigantic engineering projects. To protect the most earthquake-prone regions, a network of "lubrication wells" is being established. These man-made channels penetrate deep underground, to the very edge of the mantle. They work by injecting nanotechnology-based fluid or gel into fault lines, making it easier for rock layers to slide past each other. Explosive charges can also be dropped at strategic points, in zones where the lubrication might be less effective. Instead of sudden, huge earthquakes, the network induces a series of much smaller earthquakes. Using this method, an earthquake of magnitude 8.0 can be buffered down to magnitude 4.0 or lower, causing little or no damage to structures on the surface. In coastal locations, tsunamis can also be prevented. This is a carefully controlled process – requiring heavy use of AI – and is by no means perfect. There are complex legal and liability issues in the event of accidents. For instance, damage from human-induced earthquakes cannot be excused as an "act of God." 
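The energy bookkeeping behind buffering a magnitude 8.0 down to 4.0 can be sketched with the standard Gutenberg-Richter magnitude-energy relation (an assumed textbook formula; the text itself gives no equation):

```python
def seismic_energy_joules(magnitude):
    """Radiated seismic energy (assumed relation: log10 E = 1.5*M + 4.8)."""
    return 10 ** (1.5 * magnitude + 4.8)

e8 = seismic_energy_joules(8.0)
e4 = seismic_energy_joules(4.0)

# Each whole-magnitude step carries ~31.6x the energy, so a magnitude
# 8.0 releases a million times the energy of a magnitude 4.0 - the
# lubrication network must drain one great quake through a vast number
# of small, individually harmless events.
print(f"energy ratio M8.0 / M4.0: {e8 / e4:,.0f}")  # 1,000,000
```

Since magnitude 4.0 events rarely damage well-built structures, releasing the accumulated strain in small steps is what allows "little or no damage" at the surface.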
Despite these technical and legal hurdles, it would seem that mankind is gaining the power to control even the most destructive aspects of nature. Our solar system is passing through a million-degree cloud of gas. The Sun is approaching a boundary between the Local Cloud of interstellar gas and another cloud of extremely turbulent gas – the latter being the remnant of supernova explosions that occurred millions of years ago. The density of this medium is sufficiently low to pose no threat to Earth or any other planets. The heliosphere is deformed slightly, and the level of cosmic radiation entering the magnetosphere increases, but nothing more. However, spacecraft and satellites may be damaged by these high-energy particles unless they are upgraded.

2120: Mind uploading enters mainstream society: Adequate hardware to support human-level intelligence was available as far back as the 2020s, thanks to the exponential progress of Moore's Law. This made it possible to form simulations of neural processes. However, the underlying software foundation required for mind uploading proved to be a vastly greater challenge. Full transfer of human consciousness into artificial substrates posed enormous technical difficulties, in addition to raising ethical and philosophical issues. The sheer complexity of the brain, and its inherent fragility – along with the many legislative barriers that stood in the way – meant that it was nearly a century before such technology reached the mainstream. Some breakthroughs occurred in the latter decades of the 21st century, with partial transfer of memories and thought patterns, allowing some limited experience of the mind uploading process. However, it was only through the emergence of Pico technology and strong AI that sufficiently detailed scanning methods became available. This new generation of machines, being orders of magnitude faster and more robust, finally bridged the gap between organic human brains and their synthetic equivalents. Initially tested on monkeys, the procedure was eventually offered to certain marginalised people including death row inmates and terminally ill patients. Once it could be demonstrated as being safe and reversible, the project garnered a steady stream of free and healthy volunteers, tempted by this new form of computerised immortality. Years of red tape and legislation followed, including some of the strictest regulations ever enacted into law. Religious and conservative groups voiced their objections to what they saw as a fundamental violation of God's will. At times, this threatened to postpone the technology indefinitely. Eventually though, like so many other breakthroughs in science, the zeitgeist moved on. 
The level of demand for mind uploading proved to be enormous, and the treatment became widely available in the 2120s. Today, citizens have access to special clinics in which their biological brains can be literally discarded in favour of artificial ones. Rather than simply "duplicating" a mind, the machine physically shifts the consciousness, like a sponge soaking up water. The brain is gradually replaced – piece by piece – so the original personality remains intact during the transition. This vital aspect of the procedure assuages the fear which many have of losing their identity. For the wealthiest individuals, entire new bodies can be grown, into which the synthetic brains can be transplanted. These bodies may themselves be artificial, with options for partially cyborg or fully robotic replacements. Externally, they are often indistinguishable from real human bodies, but include many hi-tech add-ons and internal features boosting physical and mental abilities. Not everyone is opting for these types of treatments, however. A significant percentage view them with extreme suspicion, as though somehow immoral and dehumanising. With each passing year, society is becoming increasingly fractured, with an ever-widening divide between those who seek to enhance themselves, and those who prefer to eschew such technology.

2130: Large-scale civilian settlement of the Moon is underway. As a result of various new space elevators, huge numbers of Earth's citizens now have rapid, affordable, and safe access to space. Dozens of permanent Moon colonies have been established. Nanotechnology self-assemblers enable these habitats to be constructed in a matter of hours or days. Most are concentrated in the southern polar region, which has greater access to water. Advances in genetic engineering mean that humans can be fully adapted to the gravity of the Moon. In any case, scientists are developing a form of artificial gravity that will soon become available. In addition to basic exploration and surveying, the main occupations for colonists at the moment are scientific and technological research. Almost all manual/physical tasks are handled by robots, giving more leisure time for the human residents. Tourism is now a booming industry, with many thousands of people arriving on the Moon's surface each year for guided tours, even though VR simulations can recreate the Moon's environment in perfect detail. The most popular destinations are Mons Huygens (the highest mountain), Tycho (a prominent crater visible from Earth) and the Apollo landing sites. A very large telescope is also operational, for long-distance astronomical observations. The lack of atmosphere and other conditions gives it a tremendous advantage over Earth-based telescopes.

2140: A North American Union is taking shape. The 21st century witnessed a dramatic rebalancing of America's power, with much of its influence being lost to China and India. However, there were also developments closer to home, with a remodelling of the relationship to her neighbours. A gradual stagnation of the white population, and simultaneous growth of Hispanics, offered the first hints of what lay ahead. This trend would continue long into the future, with Latin American immigrants eventually dominating the southwestern states and Mexico becoming a fully developed nation. Alongside this, Canada began to experience a population and economic surge almost unparalleled in its history. Soaring global temperatures were providing access to a treasure trove of natural resources, previously locked up in the frozen north – even as the US was being ravaged by drought, flooding, wildfires, and other adverse conditions. With Canada's environment now vastly more favourable, newcomers flocked in their millions to its cheap, wide, green lands. After decades of further homogenisation and cultural interchange between each of the three nations, US power has continued to wane, both at home and abroad. Meanwhile, national borders are becoming increasingly irrelevant in the world at large. Ongoing globalisation, the birth of a single world currency, the dominance of AI in government, a defection of citizens to online "virtual states" (making physical territories less important), and other technological advances have all contributed to this. Europe has already formed its own superstate, while parts of Asia are now converging too. In light of all this, the USA begins talks with Canada regarding a North American Union. With a more globalised, supranational sentiment emerging, they are gradually unified under a single political system – strengthening the power and influence of both. Mexico eventually joins too. 
In later decades, further expansion of the union occurs with even Cuba, the Dominican Republic and other parts of the Caribbean seeing integration. By the end of the 22nd century, the whole of North and South America has joined to become the "American Union", paving the way for a truly united world government in the 23rd century. "Perfect" simulations of one cubic metre. In the early 21st century, supercomputers used a simulation technique called lattice quantum chromodynamics, performing calculations by essentially dividing space-time into a four-dimensional grid. With a resolution based on the fundamental physical laws, they could simulate only a tiny portion of the universe accurately – on the scale of one 100-trillionth of a metre, slightly larger than the nucleus of an atom. At best, algorithms were able to demonstrate the strong nuclear force among protons and neutrons and its effect on nuclei and their interactions. This was achieved in femto-sized universes where the space-time continuum was replaced by a lattice, with spatial and temporal sizes on the order of several femtometres (fermis), and whose lattice spacings (pixelations) were fractions of a fermi. Lattice gauge theory revealed new insights into the nature of matter but was still fairly limited in scope. However, computer power and information technology in general were growing exponentially. In fact, they had followed a remarkably smooth and predictable trend throughout the 20th century. This growth rate continued its consistent path in the 21st and 22nd centuries. By 2140, a region of space measuring 1 cubic metre can be simulated in near-perfect detail, down to the smallest quantum unit. This landmark in physics has profound applications. It soon paves the way for larger simulations of two metres, providing absolutely accurate representations of the entire human body.
Scientific experiments on these and similar-sized objects can now be controlled as tightly as the laws of physics allow – with data obtained far more reliably and much faster than in real-world, real-time settings. Holodeck-style environments become possible in the latter half of this century, as these simulations continue to increase in detail and spatial extent, reaching tens of metres and greater. This offers a level of realism that was unavailable with full-immersion virtual reality. To an observer placed in these miniature universes, it would be almost impossible to distinguish reality from fantasy.
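The computational leap involved can be illustrated with simple counting (all figures assumed): pixelating one cubic metre at the sub-fermi spacings used in lattice QCD requires a staggering number of grid sites.

```python
# Assumed lattice spacing: 0.1 fm (a "fraction of a fermi"), written
# directly as 1e-16 m to keep the arithmetic simple.
spacing_m = 1e-16
sites_per_edge = 1.0 / spacing_m     # sites along one metre
spatial_sites = sites_per_edge ** 3  # sites in one cubic metre

print(f"sites per edge: {sites_per_edge:.0e}")  # 1e+16
print(f"sites per m^3:  {spatial_sites:.0e}")   # 1e+48
```

Early-21st-century lattices spanned only tens of fermis per side, so the gap to a one-metre volume is many orders of magnitude per edge – the distance that sustained exponential growth in computing had to close by 2140.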

2150: Interstellar exploration is becoming common. By the mid-22nd century, a wide variety of probes to neighbouring star systems have successfully reached their destinations. The fastest of these can now achieve a significant fraction of light speed, requiring only a few decades of travel time. By way of comparison, space probes of a century earlier – such as the Voyager missions – would take many thousands of years to reach the stars. A number of different engine systems are being utilised – from antimatter to nuclear pulse propulsion and other more experimental methods. Each of these craft is equipped with powerful AI, heavily automated systems, and robots/androids. Protection from incoming debris is offered by cone-shaped force fields projected from the front of each craft. This streamlined shape causes objects like asteroids to drift by without causing any damage. After journeying for trillions of miles, the majority of probes successfully rendezvous with their destinations. These return a treasure trove of data and visual information on extrasolar planets. In addition, images are taken of Earth and the Solar System viewed from light years away, providing a new perspective of humanity and its place in the universe. Holodeck-style environments are becoming possible. The concept of virtual reality had been explored as far back as the 1930s, when Stanley G. Weinbaum wrote his short story, Pygmalion's Spectacles. Published in Wonder Stories – an American science fiction magazine – this described a "mask" with holographic recording of experiences including smell, taste, and touch. In the 1950s, Morton Heilig wrote of an "Experience Theatre" that could encompass all the senses in an effective manner, thus drawing the viewer into the onscreen activity. He built a prototype mechanical device called the Sensorama in 1962 – which displayed five short films while engaging multiple senses (sight, sound, smell, and touch).
Around this time, engineer and inventor Douglas Engelbart began using computer screens as both input and output devices. Later in the 20th century, the term "virtual reality" was popularised by Jaron Lanier. A major pioneer of the field, he founded VPL Research in 1985, which developed and built some of the seminal "goggles and gloves" systems of that decade. A more advanced concept was depicted in TV shows like Star Trek: The Next Generation (1987-1994) and movies like The Matrix (1999). These introduced the idea of simulated realities that were convincing enough to be indistinguishable from the real world. It was not until the late 2010s that virtual reality became a truly mainstream consumer technology. By then, exponential advances in computing power had solved many issues hindering previous, cruder attempts at VR – such as cost, weight/bulkiness, pixel resolution and screen latency. It was possible to combine these headsets with circular or hexagonal treadmills, offering users the ability to walk in a seamless open world. The Internet also enabled participants from around the globe to compete and engage with each other in massively multiplayer online role-playing games. While clearly a huge improvement over earlier generations of hardware, these devices would pale into insignificance when compared to full immersion virtual reality (FIVR). As computers became ever smaller and more compact, made possible via new materials such as graphene, they were beginning to integrate with the human body in ways hitherto impossible. Their components had shrunk by orders of magnitude, following trends like Moore's Law. Machines that filled entire rooms in the 1970s had become smartphones by 2010 and the size of blood cells by the 2030s. This occurred in parallel with accurate models of the brain, establishing a basic roadmap of neurological processes. 
Full immersion virtual reality leveraged these advances to create microscopic devices able to record and mimic patterns of thought, directly inside the brain. Tens of billions of these "nanobots" could be programmed to function simultaneously, like a miniature Internet, the end result being that sensory information was now reproducible through software. In other words – vision, hearing, smell, taste, and touch could be temporarily substituted by a computer program, allowing users to experience virtual environments with sufficient detail to match the real world. First demonstrated in laboratory settings and military training environments, FIVR was commercialised in subsequent decades and became one of the 21st century's defining technologies. Not everyone was amenable to having nanoscale machines inserted into their brains, however. In any case, full immersion VR provided only a superficial imitation of real life – it could not replicate every subatomic particle, for example, or the countless quantum events occurring at any given moment in time and space. Accounting for these phenomena would require a level of computing on a different scale entirely. Lattice Quantum Chromodynamics (LQCD) was a promising field in the late 20th and early 21st centuries. This allowed researchers to simulate objects and processes in near-perfect detail, using resolutions based on the fundamental physical laws. By the 2010s, for example, individual proton masses could be determined at error margins close to one percent. During the 2020s, exascale computing helped to further refine the nuclear forces and uncover exotic "new physics" beyond the Standard Model. Smaller and smaller pixelations were being applied to greater and greater volumes of space-time, as supercomputers later reached the zettascale, yottascale, and beyond. By the 2070s, it was possible to simulate a complete virus with absolute accuracy down to the smallest known quantum level.
Blood cells, bacteria and other living structures followed as this technique approached the macroscale. In the early 22nd century, mind transfer became feasible for mainstream use, with whole-brain scans now sufficiently perfected. Another milestone was passed by 2140, with a cubic metre of space-time being accurately simulated. These four-dimensional lattice grids were, in effect, miniature universes – fully programmable and controllable. When combined with artificial intelligence, matter contained within their boundaries could be used to recreate virtually anything in real time and real resolution. Spatial extents continued to grow, reaching tens of metres. Although highly convincing VR had been around for over a century, achieving this level of detail at these scales had been impossible until now. By 2150, perfect simulations can be generated in room-sized environments without any requirement for on-person hardware. As virtual reality advances still further, entire worlds are constructed using the smallest quantum units for building blocks. This opens up some profound opportunities in the 23rd century. For example, artificial planet Earths can have their parameters altered slightly – gravity, mass, temperature and so on – then fast-forwarded billions of years to compare the outcomes. Intelligent species evolving on these virtual worlds may be entirely unaware that they are part of a giant simulation. Hi-tech, automated cities. An observer from the previous century, walking through a newly developed city of 2150, would be struck by the sense of cleanliness and order. The air would smell fresh and pure, as if they were in pre-industrial countryside. Roads and pavements would be immaculate: made of special materials that cleaned themselves, absorbed garbage and could self-repair in the event of damage. Building surfaces, windows and roofs would be completely resistant to dirt, bacteria, weather, graffiti, and vandalism.
These same coatings would be applied to public transport, cars, and other vehicles. Everything would appear brand new, shiny and in perfect condition at all times. Greenery would feature heavily in this city, alongside spectacular fountains, sculptures, and other beautification. Lamp posts, telegraph poles, signs, bollards, and other visual "clutter" that once festooned the streets have disappeared. Lighting is achieved more discreetly, using a combination of self-illuminating walls and surfaces, antigravity and other features designed to hide these eyesores, maximising pedestrian space and aesthetics. Electricity is passed wirelessly from building to building. Room-temperature superconductors – having begun to go mainstream in previous decades – are now ubiquitous in every major city. These allow the rapid movement of vehicles without the need for tracks, wheels, overhead cables, or other components. Cars and trains simply drift along silently, riding on electromagnetic currents. Signposts are obsolete – all information is beamed into a person's visual cortex. They merely have to "think" of a particular building, street, or route to be given information about it. This observer would also notice their increased personal space, and the relative quiet of areas that, in earlier times, would have bustled with cars, people and movement. In some places, robots tending to manual duties might outnumber humans. This is partly a result of the reduction in the world's population. However, it is also because citizens of today spend the majority of their time in virtual environments. These offer practically everything a person needs in terms of knowledge, communication, and interaction – often at speeds much greater than real time. Limited only by a person's imagination, they can provide richer and more stimulating experiences than just about anything in the physical world. On those rare occasions when a person ventures outside, they are likely to spend little time on foot.
Almost all services and material needs can be obtained within the home, or practically on their doorstep – whether it be food, medical assistance, or even replacement body parts and physical upgrades. Social gatherings in the real world are infrequent, usually reserved for "special" occasions such as funerals, for novelty value, or the small number of situations where VR is impractical. Crime is almost non-existent in these hi-tech cities. Surveillance is everywhere: recording every footstep of your journey in perfect detail and identifying who you are, from the moment you enter a public area. Even your internal biological state can be monitored – such as neural activity and pulse – giving clues as to your immediate intentions. Police can be summoned instantly, with robotic officers appearing to 'grow' out of the ground through the use of blended claytronics and nanobots, embedded into the buildings and roads. This is so much faster and more efficient that in most cities, having law enforcement drive or fly to a crime scene (in physical vehicles) has become obsolete. Although safe and clean, some of these hi-tech districts might appear rather sterile to an observer from the previous century. They would lack the grit, noise and character which defined cities in past times. One way that urban designers are overcoming this problem is through the use of dynamic surfaces. These create physical environments that are interactive. Certain building façades, for instance, can change their appearance to match the tastes of the observer. This can be achieved via augmented reality (which only the individual is aware of), claytronic surfaces and holographic projections (which everybody can see), or a combination of the two. A bland glass and steel building could suddenly morph into a classical style, with Corinthian columns and marble floors; or it could change to a red brick texture, depending on the mood or situation.

2151: Total solar eclipse in London. A rare total eclipse takes place in Britain this year, with parts of London experiencing totality. The last time this occurred was in 1715; the next will be in 2600 AD.

2160: The world's first bicentenarians. Certain people who were born in the 1960s are still alive and well in today's world. Life expectancy had been increasing at a rate of 0.2 years per year at the turn of the 21st century. This incremental progress meant that by the time they were 80, these people could expect to live an additional decade on top of their original lifespan. However, the rate of increase itself had been accelerating, due to major breakthroughs in medicine and healthcare, combined with better education and lifestyle choices. This created a "stepping stone", allowing people to buy time for the treatments available later in the century – which included being able to halt the ageing process altogether.
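The "stepping stone" argument can be made concrete with a toy model (all rates are assumptions for illustration): survival becomes open-ended once medicine adds more than one year of remaining life expectancy per calendar year lived.

```python
def final_age(start_age, remaining_years, gain_per_year, cap=200):
    """Age one year at a time; each year medicine restores
    `gain_per_year` years of remaining life expectancy."""
    age, years = start_age, 0
    while remaining_years > 0 and years < cap:
        age += 1
        years += 1
        remaining_years -= 1              # one year of life spent...
        remaining_years += gain_per_year  # ...partly offset by medicine
    return age

# At the circa-2000 rate (0.2), an 80-year-old with 10 years left gains
# only a modest extension; above 1.0, they never run out of years.
print(final_age(80, 10, 0.2))  # 93
print(final_age(80, 10, 1.2))  # 280 (hits the cap: open-ended)
```

This crossover – remaining lifespan being replenished faster than it is consumed – is sometimes described as "longevity escape velocity".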

2170: Underground lunar cities are widespread. In the second half of the 21st century, the emergence of low-cost space travel made it relatively easy to access the Moon and its surface. This led to a surge of new explorers, entrepreneurs and cooperatives looking to make their mark on its undeveloped territories. Now, almost a hundred years after these early settlers arrived, the lunar environment has become a hive of activity. Lava tunnels – among the most sought-after locations – had been formed in the ancient past by molten rock flowing underground, either from volcanic activity or the impact of comets and asteroids causing terrain to melt. These left behind enormous caves, with some of the largest found to be over 100 kilometres in length and several kilometres wide. Telescopic observations, orbiting probes, and landers in the 20th and 21st centuries revealed more and more information about these regions. Eventually it became possible to send fleets of automated drones and other vehicles below the surface – producing detailed 3D maps with imaging techniques such as LiDAR and muon tomography. The latter had been used with great success on Earth – to uncover hidden chambers in the Egyptian Pyramids, to detect nuclear material in vehicles and cargo containers for the purposes of non-proliferation, and to monitor potential underground sites used for carbon sequestration. Its usage on the Moon allowed deep scanning of tunnels at high spatial resolution. By 2100, the Moon's surface and subsurface had been thoroughly mapped, with a large and growing number of commercial operations being established to secure prime real estate. The popularity of these underground sites owed in part to their shielding from radiation, micrometeorites and temperature extremes. Sealing of the "skylight" openings with an inflatable module, designed to form a hard outer shell, provided a further layer of protection – and the potential to create a pressurised, breathable atmosphere.
In addition to establishing a safe environment, a number of useful resources could be extracted from these depths. Titanium, for example, appeared in concentrations of 10% or more, whereas the highest yields on Earth rarely exceeded 3%. The tunnels also contained rare minerals, which had formed as the lava slowly cooled and differentiated. In the polar regions, some tunnels led directly to frozen water deposits, providing easier access. As the decades passed, the expansion of underground colonies began to accelerate. The initial settlements, containing the bare essentials in terms of food, water, oxygen production and habitat modules, evolved into towns with their own culture and identity. Structural engineers, having assessed the arch-like roofs overhead, found them to be stable even at widths of several kilometres. On Earth, lava tubes were unable to form at such enormous sizes, but the Moon's lower gravity (0.16 g) and lack of weathering or erosion made this possible. The Moon began to attract more and more residents, seeking a life away from Earth and the chance to form a new society. In response to the environmental crisis and humanity's ever-growing footprint, a "deindustrialisation" movement had been gathering pace. This aimed to reduce the burden on Earth's ecosystems by "offworlding" many traditional production, extraction and manufacturing operations to the lunar environment. By 2170 – two hundred years since humanity first set foot on the Moon – these lava tunnels are filled with entire cities, home to many millions of people and their robot/AI companions. Much of the infrastructure (including some very large supercomputers) has been imported from Earth, as part of the aforementioned deindustrialisation projects. Most of the original cave entrances, made from inflatable materials, have now been upgraded into fully-fledged airlocks for handling the rapid arrival and departure of large spacecraft.
In some of the largest and deepest caverns, the lunar conditions allow for a number of gigantic science experiments. For instance, neutrino detectors are being built at scales and efficiencies unmatched on Earth, taking advantage of the greater isolation from background interference. These are revealing profound insights into astronomical phenomena and the nature of the Universe. Ongoing expansion of the "datasphere" – defined as the world's aggregate of generated and stored data – continues to drive the growth of computing and related technologies in the 22nd century. With more and more space required to house data centres and supercomputers, the Moon has become a major focus of activity in terms of fulfilling this need. Formerly separate cave systems are now interconnected, forming hyper-fast networks across a significant portion of the Moon's surface and subsurface.

2190: Nitrous oxide (N2O) has fallen to pre-industrial levels. Nitrous oxide (N2O) is a naturally occurring gas emitted by bacteria in the soils and oceans, forming part of the Earth's nitrogen cycle. It was first synthesised by English natural philosopher and chemist Joseph Priestley in 1772. From the Industrial Revolution onwards, human activities began to significantly increase the amount of N2O in the atmosphere. By the early 21st century, about 40% of total emissions were man-made. By far the largest anthropogenic source (80%) was agriculture – the use of synthetic fertilizers to improve soil, as well as the breakdown of nitrogen in livestock manure and urine. Industrial sources included the production of chemicals such as nylon, internal combustion engines for transport, and oxidizers in rocketry. Known as "laughing gas" due to its euphoric effects, it was also used in surgery and dentistry as an anaesthetic and analgesic. Nitrous oxide was found to be a powerful greenhouse gas – the third most important after carbon dioxide and methane. While not as abundant in the atmosphere as carbon dioxide (CO2), it had almost 300 times the heat-trapping ability per molecule and caused roughly 10 percent of global warming. After the banning of chlorofluorocarbons (CFCs) in the 1980s, it also became the most important substance in stratospheric ozone depletion. By the mid-21st century, the effects of global warming had become very serious. While most efforts were focussed on mitigating CO2, attempts were made to address the imbalance of other greenhouse gases, including N2O. There was no "silver bullet" for this. Instead, it would take a combination of substantial improvements in agricultural efficiency, reduced emissions in the transportation and industrial sectors, and changes in dietary habits towards less per capita meat consumption in the developed world.
While many technologies and innovations were already available in earlier decades, these targets were unfortunately difficult to achieve – due to additional costs and the absence of political will for implementation. It was only during the catastrophic events of the second half of the century that sufficient efforts and financial resources were directed towards the problem. With an atmospheric lifetime of 114 years, man-made N2O proved difficult to stabilise and remained in the atmosphere well into the 22nd century. By 2190, it has fallen to around 270 parts per billion (ppb), its pre-industrial level. As well as halting its impact on global warming and ozone damage, other benefits include better overall air quality, reduced loss of biodiversity in eutrophied aquatic and terrestrial ecosystems, and substantial economic gains.
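The long tail described above follows simple first-order decay. A minimal sketch, using the 114-year lifetime and 270 ppb pre-industrial level from the text; the 340 ppb starting concentration is an illustrative assumption of mine:

```python
import math

TAU = 114.0    # atmospheric lifetime of N2O, years (e-folding time, from the text)
N_EQ = 270.0   # pre-industrial equilibrium concentration, ppb

def n2o_ppb(n0, years):
    """Concentration `years` after man-made emissions cease at n0 ppb."""
    return N_EQ + (n0 - N_EQ) * math.exp(-years / TAU)

# The excess over equilibrium shrinks to 1/e of its value every 114 years;
# here a 70 ppb excess falls to ~26 ppb after one lifetime:
print(round(n2o_ppb(340, 114), 1))
```

This is why, even after emissions were curtailed, concentrations remained elevated for well over a century.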

2200: Traditional employment is becoming obsolete. The average citizen today is likely to spend the vast majority of their time in a virtual reality of some kind. Physical society and culture still exist – but most eschew them in favour of the Godlike abilities they can experience online. It is very rare to meet a friend or colleague in person now. You are far more likely to encounter a form of artificial intelligence today than you are a living, breathing human. Urban centres have become eerily deserted, with most people to be found in their homes – or in digital libraries and entertainment venues – engaged in complex simulations that offer perfect recreations of the real world. To observers from earlier centuries, these virtual environments would appear truly dazzling in their speed and complexity, with an almost unimaginable level of detail, creativity, and ingenuity. A trend which began during the Industrial Revolution has now reached its ultimate conclusion. Working hours had gradually declined over the centuries, thanks to a combination of technology, automation, improvements in working conditions and employee rights, changing labour demands and a shift in the cultural zeitgeist. By 2050, the average person in a developed country was employed for under 30 hours per week, and this fell to 20 hours by 2100. Working hours continued to fall in the 22nd century as machines – including life-like androids – took on ever more complex and sophisticated roles. As humans began to enhance their cognitive abilities, the nature of work itself was changing. More and more people were moving from "drudge" jobs into their own personal, creative, and intellectual pursuits. The line between work and play was beginning to blur. Some roles, for example, were now taking the form of extremely challenging "games", based on anomalies and problems arising from discoveries for which AI programs were unable to offer adequate explanations.
Alongside this, average spending on various household items and utilities, when measured as a percentage of disposable income, was steadily declining. By 2200, this trend is complete. In most countries, basic items such as food, energy and clothing are now essentially free, with little or no need for the average person to work in order to acquire them. Recent advances in replicator technology provide an abundance of resources – eliminating famine, disease, and the need for war. Literally everything has been automated, digitised, and made easier. Take the emergency services, for example. Hospital visits are rarely required now, as practically everything a person needs in terms of treatment is available at home, or within their own body. Police forces are dominated by robots, and, in any case, physical crimes have been largely eradicated. Firefighters are no longer needed, since they are robotic too, while building regulations and nanotechnology materials can prevent most fires occurring in the first place. This process of falling employment was, of course, by no means a smooth transition. It caused profound economic and political disruption throughout the 21st and 22nd centuries. By 2200, however, the world has fully adapted to these changes and is entering a period of artistic and cultural splendour the likes of which have never been seen before. Whether as explorers in space, or designers of entire new worlds in cyberspace, humans are free to pursue their greatest dreams and personal aspirations – unshackled from the confines of traditional economic and monetary systems.

2210: A global rewilding effort is underway. Human activity in the 19th through 22nd centuries led to catastrophic damage to the natural world. Of the approximately 30 million known species of flora and fauna, more than half were lost as a result of pollution, climate change, deforestation, mining, agriculture, urban sprawl, overfishing and hunting. Extinctions on this scale had occurred only five times previously in Earth's history. Various wars, nuclear attacks, industrial accidents, and nanotechnology experiments also played a role in making large tracts of the world essentially lifeless. Permanent damage was done to countless habitats. The Amazon rainforest, perhaps the most egregious example, had shrunk to become mostly desert by 2100. Meanwhile, ocean acidification caused by rising CO2 levels resulted in the decimation of coral reefs. The Arctic became devoid of ice during summer months, while melting in Greenland, Iceland, West Antarctica and elsewhere led to sea level rises of nearly two metres by the 22nd century. All of this occurred despite an in-depth scientific knowledge of the processes underway. Long-term sustainability and sensible management of resources were sacrificed in favour of short-term profits, political influence, and personal gain. By the time most governments began to enact serious measures, it was already too late. Biodiversity fell away to such an extent that, for those born in the late 20th century, the planet became unrecognisable. Younger generations growing up in this new world found themselves bitterly resentful of what their predecessors had allowed to happen. Many in Asia, Africa and South America would never get to experience a real forest, come face to face with animals larger than a domestic dog, or witness the range of colourful and exotic species that were commonplace before – except in zoos, or virtual reality. Older members of society came to be vilified.
Some nations even organised "crimes against nature" trials, leading to the conviction of former politicians and fossil fuel executives. By the 23rd century, however, technology was advancing to a whole new magnitude of power and possibilities. Super intelligent entities were now dominating business and government, formulating policies to benefit everybody rather than the few. Meanwhile, a new and gigantic system of orbital infrastructure was being planned, allowing man to directly control the Earth's climate. Consumer devices were also becoming available that could reproduce food and other items without needing to plunder natural resources. An idea began to emerge that quickly gained momentum. It would require an international, concerted effort over a number of generations, but it had support from across the political spectrum. "Pre-Holocene Rewilding" had been discussed in the past and even attempted on a small scale, but global versions lacked the necessary consensus, mainly due to the costs, technical challenges, and social issues. However, the enormous wealth and prosperity now emerging on Earth – along with the perfection of certain biotechnologies – meant that such a megaproject was becoming feasible. In essence, it would involve the recreation of extinct animals and plants, brought back to life by a combination of fossil records, DNA samples, computer models and molecular engineering. Once grown or reproduced in sufficient numbers, these would be distributed back to their original native environments: as close as possible to how they lived prior to human industry. They would then be managed in such a way that people could cause them no harm – and vice versa. This rewilding effort became the single largest environmental project in history. Entire deserts were transformed back into lush Edens, fed by artificial rain and other forms of weather control.
Vast areas of abandoned wasteland became rich ecosystems teeming with life; even ancient megafauna such as mammoths returned. Toxic lakes and rivers were made clean. The oceans were de-acidified, cooled and made habitable once again to countless fish, molluscs, crustaceans, and other aquatic invertebrates. Urban sprawl was dramatically reversed and scaled back, with a focus instead on highly compact vertical structures. Slowly, the Earth recovered. Humanity had reached an equilibrium with its surroundings. Though it would take another few decades, the final elements were falling into place to ensure the future preservation of biodiversity.

2220: Moore's Law is reaching its physical limit. One of the notable technological milestones of the early 23rd century is the perfection of traditional computer substrates, which are now becoming the densest and most efficient allowed by the laws of physics. This has been accomplished by super AI, working to create subatomic matrices at unimaginably tiny scales. Some of the ultra-compact designs now being used incorporate diamondoid-like materials and fractalized structures, with recursive properties for encoding information at levels far below even femtotechnology. A computing device with a mass of one kilogram (roughly equivalent to a conventional laptop circa 2020) can today perform a maximum of 5.425 × 10⁵⁰ operations per second. This is 33 orders of magnitude greater than a state-of-the-art supercomputer of two centuries earlier. One of the major problems – which had vexed even the greatest of the super AI minds – was the issue of heat management. In early 21st century terms, generating such a vast amount of computational power would produce the equivalent of a thermonuclear explosion. However, technical considerations for this and other challenges were eventually solved, based on new physics. By 2220, this form of "computronium" has been largely perfected for mainstream use, although experiments continue with certain exotic forms of matter, attempting to eke out further gains in performance and energy efficiency. Much of the Solar System is being turned into "smart" rocks, with neural networks embedded throughout asteroids and on the surface of moons, for example.
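The quoted figure matches Seth Lloyd's well-known bound on the computational capacity of one kilogram of matter, ops/s ≈ 2mc²/(πħ). A quick check, using standard values for the physical constants:

```python
import math

# Lloyd's "ultimate laptop" bound: ops/s = 2 * m * c^2 / (pi * hbar).
C = 2.99792458e8         # speed of light, m/s
HBAR = 1.054571817e-34   # reduced Planck constant, J*s

def lloyd_limit(mass_kg):
    return 2 * mass_kg * C**2 / (math.pi * HBAR)

ops = lloyd_limit(1.0)
print(f"{ops:.3e} ops/s")   # ~5.4e50 operations per second, as quoted
# Orders of magnitude above a ~1e17 FLOPS supercomputer of circa 2020:
print(int(math.log10(ops / 1e17)))
```

The second print recovers the "33 orders of magnitude" claim, taking ~10¹⁷ FLOPS as a representative 2020 supercomputer.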

2227: Pluto is closer to the Sun than Neptune. Pluto takes 248 Earth years to go once around the Sun. Its orbital characteristics are substantially different from those of the planets, which follow nearly circular orbits around the Sun close to a flat reference plane called the ecliptic. In contrast, Pluto's orbit is moderately inclined relative to the ecliptic and moderately eccentric (elliptical). This means that for periods of 20 years, it comes closer to the Sun than Neptune. The last time this occurred was from 1979 to 1999; it happens again between 2227 and 2247. As Pluto moves closer to the Sun, ice on its surface begins to warm slightly and sublime ("evaporate" from solid to gas, without passing through an intermediate liquid phase). This forms a thin atmosphere that consists mostly of nitrogen (N2), methane (CH4), and carbon monoxide (CO). At between 6.5 and 24 μbar, it is roughly 40,000 to 150,000 times lower than Earth's atmospheric pressure. As Pluto moves away from the Sun, the gases cool, and the atmosphere begins to refreeze, eventually disappearing completely as temperatures reach a low of -240°C (-400°F). It is now three centuries since the dwarf planet's discovery in 1930. Both Pluto and its five moons have been thoroughly explored and studied in complete detail – the focus has shifted from scientific analysis to resource exploitation. While lacking the precious metals and minerals found in the asteroid fields and elsewhere, Pluto nevertheless holds useful quantities of frozen nitrogen and water. In addition to surface operations, the interior of the world is being transformed by large-scale automation and robotics drilling down. Despite Pluto's orbit appearing to cross that of Neptune when viewed from directly above, the two objects' orbits are aligned so that they can never collide or even approach closely. They are always separated by at least 17 AU. The next time Pluto moves closer to the Sun than Neptune will be in the year 2475.
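Two quick checks on the figures above: the inside-Neptune interval recurs once per 248-year orbit, and the 6.5–24 μbar surface pressure can be converted into a ratio against Earth's sea-level pressure (the standard 1.013 bar, which is my assumption, not a figure from the text):

```python
PLUTO_PERIOD = 248     # orbital period, years
LAST_CROSSING = 1979   # start of the previous inside-Neptune interval

# Each 20-year interval recurs one orbital period later: 1979 -> 2227 -> 2475.
crossings = [LAST_CROSSING + n * PLUTO_PERIOD for n in (1, 2)]
print(crossings)

EARTH_UBAR = 1.013e6       # Earth sea-level pressure in microbar (1.013 bar)
for p_ubar in (24, 6.5):   # Pluto's surface pressure range, microbar
    # Ratio works out at roughly 42,000x to 156,000x lower than Earth's.
    print(round(EARTH_UBAR / p_ubar))
```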

2230: Antimatter-fuelled starships. The fastest of today's spacecraft are now capable of sustained travel at between 0.9 and 0.99c (90-99% lightspeed). This is fast enough to reach nearby stars within relatively short timeframes. One of the more common ship designs is a "ring" containing matter-antimatter fuel, purposefully collided to release vast amounts of energy for thrust. This energy is also used to maintain stability and project fields around the craft, protecting it from meteoroids and other incoming hazards. Many deep-space missions are now underway, including trips to Earth-like planets within 100 light years. Most of these ships are unmanned, but a small percentage contain human pilots. These are invariably transhuman with heavily modified bodies and brains, more able to cope with long journeys than natural, unaided humans.
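The effect of these cruise speeds on journey times can be sketched with special relativity. A simplified calculation, assuming constant speed and ignoring acceleration and deceleration phases:

```python
import math

def lorentz_gamma(beta):
    """Time-dilation factor at speed beta (fraction of lightspeed)."""
    return 1.0 / math.sqrt(1.0 - beta**2)

def trip_times(distance_ly, beta):
    earth_years = distance_ly / beta                 # time measured on Earth
    ship_years = earth_years / lorentz_gamma(beta)   # proper time aboard
    return earth_years, ship_years

# A 100-light-year voyage at the two cruise speeds quoted above:
for beta in (0.9, 0.99):
    earth, ship = trip_times(100, beta)
    print(f"{beta}c: {earth:.1f} yr on Earth, {ship:.1f} yr aboard")
```

At 0.99c, the 100-light-year trip takes about 101 years of Earth time but only around 14 years of shipboard time, which is why such voyages are tractable for heavily modified crews.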

2240: Christianity is fading from American culture. After centuries of decline, Christianity is on the verge of disappearing from American culture. The vast majority of the US population is now atheist or agnostic. This same trend was witnessed in Europe at a far earlier date. However, religion was so deeply embedded in the American psyche that it took substantially longer to reach this stage.

2250: Humanity is a Type 1 civilisation on the Kardashev scale. By this date, virtually all of the Earth's natural energy is being captured and harnessed in some way. Vast swathes of land, sea and atmosphere have been transformed into a series of enormous power grids using wind, solar, hydroelectric, and geothermal technologies. These are supplemented by fusion and antimatter, along with wholly new forms of energy production that were unknown to scientists in previous centuries. In Earth orbit, a comprehensive network of stations is now in place. This cluster is organised in such a way that it harvests every ounce of incoming solar radiation that would otherwise be reflected back into space. In terms of raw electrical power, the total converted energy is equivalent to 52 petawatts (PW). Each vessel acts as a node within a gigantic web, completely encircling the planet. The nodes produce attractive forces between each other, forming an invisible "shield" absorbing solar radiation from literally the entire globe. This technology has the added benefit of stabilising the Earth's climate, since the network can be adjusted at certain points to control the amount of heat getting through. These variations allow rain to fall whenever and wherever necessary, while hurricanes and other such phenomena are easily controlled. The vast amount of energy now available to humanity is creating enormous wealth and prosperity. Earth's physical infrastructure is revolutionised – with teleporters available for civilian use, gargantuan skyscrapers reaching miles into the sky, and material needs practically eliminated. However, even greater breakthroughs have been occurring in cyberspace, which, for many people, has completely supplanted physical reality. Virtual worlds are now of such grandeur and ingenuity that they far surpass anything in the real world. These digital environments run at speeds considerably greater than real time – further accelerating the pace of innovation.
From this point onwards, the only way for civilisation to gain more energy is to expand outward into space. Massive colonisation efforts are now underway. Asteroid mining dominates the economy of the inner Solar System, while hydrogen and helium are being siphoned from the gas giants. Meanwhile, terraforming of Mars has passed a critical stage, with bacteria and lichen beginning to appear on the surface. Interstellar travel increases greatly during this time. The settlements on Alpha Centauri, Barnard's Star and Wolf 359 are inhabited by thousands of humans and machines by now, while the most distant exploratory craft have reached over 50 light years from Earth. Faster-than-light travel is proving to be more difficult to achieve than previously thought, however, with maximum velocity still limited to 0.99c.
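The Type 1 claim can be checked against Carl Sagan's continuous version of the Kardashev scale, K = (log₁₀P − 6)/10 with P in watts, under which Type 1 corresponds to 10¹⁶ W:

```python
import math

def kardashev(power_watts):
    """Sagan's continuous Kardashev rating: Type 1 corresponds to 1e16 W."""
    return (math.log10(power_watts) - 6) / 10

print(round(kardashev(52e15), 2))   # the 52 PW figure above: just past Type 1
print(round(kardashev(1e16), 2))    # the canonical Type 1 threshold
```

At 52 PW the rating comes out around 1.07, consistent with a civilisation that has just crossed the Type 1 threshold and must now expand into space to grow further.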

2257: Eris reaches perihelion. Eris is the second largest dwarf planet in the Solar System. Discovered in 2005 by the Palomar Observatory in California, it was found to be 2,326 km (1,445 mi) in diameter, making it only slightly smaller than Pluto. Although Pluto is larger by volume, Eris has a greater mass, the ninth greatest in the Solar System. Subsequent observations revealed that Eris had a small moon, later named Dysnomia. The albedo of Eris was measured at 0.96, making it brighter than any other solid body in the Solar System except Enceladus. It was speculated that its high albedo was due to surface ices being replenished by temperature fluctuations as Eris's orbit took it closer and farther from the Sun. As a trans-Neptunian object (TNO), and a member of the scattered disk, the orbit of Eris is highly eccentric – ranging from 97.56 astronomical units (AU) at its furthest point, to 37.77 AU during its closest approach to the Sun. It last reached perihelion in 1699 and aphelion in 1977. With the exception of some comets, Eris and Dysnomia were the most distant known natural objects in the Solar System during the early 21st century, at roughly three times the distance to Pluto. Eris has an orbital period of 558 years. It returns to perihelion in 2257. Models of internal heating via radioactive decay suggested that Eris could have an internal ocean of liquid water at the mantle–core boundary. This was confirmed by robotic exploration missions during the mid-21st century. With its relatively closer distance to the Sun, Eris gains importance in the 23rd century – as both a source of water extraction, and a staging post to more remote destinations like the Oort Cloud, Alpha Centauri and beyond.
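The orbital figures above are internally consistent, as a quick application of Kepler's third law (P² = a³ with P in years and a in AU) shows:

```python
APHELION, PERIHELION = 97.56, 37.77   # AU, from the text

a = (APHELION + PERIHELION) / 2       # semi-major axis, AU
eccentricity = (APHELION - PERIHELION) / (APHELION + PERIHELION)
period = a ** 1.5                     # Kepler's third law: P = a^(3/2) years

print(round(a, 3), round(eccentricity, 3))
# ~557 years, within a year of the quoted 558 (the small gap comes from
# rounding the apsis distances); 1699 plus one period gives the 2257 perihelion.
print(round(period))
```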

2260: Accelerated development of the Solar System. With humanity shifting its focus beyond Earth, a veritable gold rush is taking place throughout the Solar System. Countless technological and engineering marvels are now possible in space, aided by the vast growth of AI, which is dominating planetary and interplanetary government. On Mercury, colonies roll around the equator on giant train tracks – keeping pace with the planet's rotation so that they are always kept in the "terminator" zone. This ensures that the Sun never rises fully above the horizon, maintaining the optimum temperature and brightness. On Venus, a massive terraforming effort is underway. Automated craft are being sent to water-rich comets, redirecting them into the planet's upper atmosphere, while genetically engineered extremophile bacteria are being seeded on the ground. Due to the much denser and more hostile atmosphere, this process is slower than the efforts on Mars. However, with many people achieving practical immortality, human endeavours are becoming increasingly focussed on the long term. Already, there are citizens buying up land and real estate on the surface in preparation for the centuries and millennia ahead. Earth's own moon is the most heavily populated of all the space colonies. It now has millions of permanent inhabitants. Entire cities have grown up around the original Apollo landing sites. Huge numbers of citizens are involved in the expansion of infrastructure and technological research. In the previous century, Mars had declared its independence from Earth. Its surface is now dotted with large tent cities and criss-crossed with a hyper-fast, automated rail network. The terraforming operation is proceeding on schedule, with a number of giant orbital mirrors now in place. Thousands of asteroids are now being mined – both in the main belt and the Trojan region.
Advances in nanotechnology and related fields enable these rocky bodies to be stripped literally atom-by-atom, so that nothing goes to waste. Precious metals are harvested for use in hi-tech industries, while volatiles such as water can be supplied wherever needed. Some of these minor planets are being hollowed out, their interiors converted into enormous power plants, scientific stations, or utopian habitats. Meanwhile, Jupiter is gaining a steady influx of prospectors – led by cyborgs and non-biologicals. These individuals are better adapted to the radiation belts and harsh environmental conditions on Ganymede, Io, Europa et al. Whilst the moons are being exploited for their metals, minerals and water ice, Jupiter itself is being mined for its rich gas resources. A number of floating cities and gigantic refineries are beginning to appear in the upper atmosphere. A similar situation is occurring in the Saturnian system. Its largest moon Titan is now home to entire cities, while Enceladus is showing great potential as a water source for the outer Solar System. The rings of Saturn now have luxurious hotels orbiting around them, offering spectacular views of the planet. Guests can venture outside and drift among the slowly moving ice fragments. Even the distant planets of Uranus and Neptune have gas miners now. Pluto and the Kuiper Belt, too, are being exploited and developed after being fully probed and catalogued in detail. The number of citizens making the journey to neighbouring star systems is growing exponentially during this time.

2280: Microbial life is confirmed on an exoplanet. Antimatter propulsion has allowed much of the local stellar neighbourhood to be explored. By the end of the 23rd century, the first direct confirmation of alien life is obtained. After surveying thousands of worlds, automated probes uncover a unicellular organism within 100 light years of our Solar System. This lifeform reproduces via binary fission. It has an internal chemical structure that is remarkably similar to DNA, suggesting that this form of replication may be quite common throughout the Universe. It exists on a warm, wet planet with a G-type star as its parent. This system includes gas giants in the outer regions, offering some protection from comets and other incoming bodies. The impact of this discovery is less profound than it might have been in previous centuries. Religion has already vanished from many societies, for example. Science and technology in general have advanced so far – and in so many new ways – that previous "wonders" of the Universe are beginning to diminish in significance: even the possibility of alien life. Though still hailed as a landmark discovery, the emotional aspect of this and other breakthroughs has waned considerably, as human thought and endeavour become ever more computerised, logical, and machine-like. Transhumans and robots now make up the bulk of the mainstream population. Efforts are now underway to uncover more advanced and complex life, and the potential for contact with intelligent alien races is being given considerable attention.

2300: Superhuman powers are available to common citizenry. The nanotechnology of recent decades has conferred powers on citizens that would be considered superhuman by 21st century standards. These upgraded "transhumans" could perform feats regarded as Godlike by denizens of earlier times. A suitably upgraded individual – if transported back to the year 2000 – would be impervious to weaponry and nearly impossible to contain. They could morph their body into a seemingly infinite variety of forms depending on the situation encountered. If trapped in a maximum-security prison, for instance, they could alter their own molecular structure, allowing them to walk through walls. They could broadcast electromagnetic pulse waves to disable electronic devices, vehicles and other objects. Bullets and other projectiles would pass through them with no effect. Microscopic cameras, distributed throughout their body, would function as an all-round 360° sensor – covering the entire electromagnetic spectrum and making it impossible for an attacker to surprise them. They could sprint at lightning speed and cross a variety of dangerous terrain types: even molten lava would present no obstacle to them. If necessary, they could levitate from ground level to the roof of a skyscraper in seconds. They could turn themselves invisible, morph into another person entirely, or stretch their limbs like elastic. In many ways, they would resemble a comic book superhero or video game character. They could manipulate their environment in various ways, generating enough body heat to light a fire, for example, or turning inanimate objects into advanced nanotechnology tools, or modifying the properties of liquids. They could heal a wounded person just by touching them. They could read thoughts and emotions or extract recent memories.
If standing near others of their kind, they could link and combine their powers to even greater levels – harnessing the power of local weather, for example, or lifting objects weighing thousands of tons. Their sensory capabilities would be phenomenal. This 24th century person could view individual atoms with the naked eye or, if they wanted to, use their telescopic vision to see distant astronomical objects. They could hear a whisper from miles away, or filter specific voices from a cacophony of background noise. They could determine a precise chemical composition just by tasting, touching, or smelling it. Due to their various biotechnology aids and physical upgrades, they would never require sleep. They could even survive without food and water – living instead off the energy of their surrounding environment, which would be absorbed into their photosynthetic, piezoelectric skin. This same external layering would keep them at peak levels of physical performance, as well as shielding them from the elements. In fact, many citizens of today have abandoned their homes altogether and taken to a nomadic lifestyle, for this and other reasons. Often, a "home" of today is little more than a small booth or alcove in the street, where a person can temporarily recharge and recuperate, or utilise the greater powers of the net. Even a person's body is often temporary, as they shift between various real world and digital environments. Much of the Earth is now being transformed into a gigantic computer grid where individuals can physically "plug" themselves in. Not everyone has opted to make this transition. Even now, there are segments of society which are adamant in maintaining a natural, minimally upgraded human body. These people are now a definite minority, however, given the practical immortality and other benefits offered by transhumanism.

2310: Voyager 1 reaches the Oort Cloud. The Voyager 1 probe has been travelling through space since 1977. It reached a milestone in 2012 when it departed the heliosphere and began to interact with the interstellar medium. Its next milestone would have to wait another 300 years, however. By 2310, it has reached the inner part of the Oort Cloud – a vast collection of ancient comets and icy planetesimals, forming a gigantic sphere that surrounds our Solar System. This begins at roughly 2,000 astronomical units (AU), extending for another 50,000 AU. Its outer edge marks the end of the Sun's gravitational dominance, beyond which lie stars like Alpha Centauri. The probe has long since been passed by faster spacecraft, which have already explored these neighbouring systems. It remains drifting onward as a historical relic, taking 30,000 years to pass through the Oort Cloud.

2333 AD: The Dounreay nuclear site becomes safe. In the mid-20th century, Dounreay on the north coast of Scotland became a site for experimental nuclear technology – including fast breeder reactor prototypes alongside nuclear propulsion for Royal Navy submarines. The 178-acre facility, chosen for its remote location away from large population centres, remained in operation from 1955 until 1994. In 1977, seawater leaked into a shaft packed full of radioactive waste along with sodium and potassium. This caused a catastrophic reaction, blasting off the steel and concrete lids of the shaft and littering the area with radioactive particles. In addition, tens of thousands of fragments of radioactive fuel escaped into the sea between 1963 and 1984 – believed to have washed away from the drainage of cooling ponds. This resulted in a ban on fishing within 2 km of the plant. By 2011, more than 2,300 radioactive particles had been recovered from the sea floor, and 480 from the beaches. In 1998, the UK Atomic Energy Authority proposed a new timetable for decommissioning, with a cost estimated at £4.3 billion. Waste removal and the demolition of buildings began to progress over the subsequent decades. Teams working on the site used robots – including a particularly large, 75-tonne machine nicknamed the "Reactosaurus" – to dismantle parts of the reactor interior. By 2025, all legacy Magnox fuel had been retrieved, and by 2029 all radioactive waste had been removed from the shaft affected by the 1977 incident. However, much still remained to be cleaned up. The site's plutonium would not be repacked until 2060. The last building would not be removed until 2125, while the final "Low Level Waste" would take until 2129 to dispose of. Even then, longer-term radiation effects lingered, keeping the land unsuitable for human use. In 2020, a report from the Nuclear Decommissioning Authority presented a schedule of 313 years for Dounreay to be fully restored.
By the year 2333, this decontamination process is complete.

2460 AD: The first americium batteries are dying. In the first half of the 21st century, americium began to emerge as a potential new source of electricity for spacecraft. This human-made element was a by-product of the decay of plutonium, produced from nuclear reactors. It featured 146 neutrons and 95 protons/electrons, with a half-life of 432 years. The much greater mission durations provided by long-lasting "space batteries" enabled a new generation of deep space probes to explore the far reaches of the outer Solar System. Lower launch costs, combined with ever more compact and miniaturised components, along with greater autonomy, made it possible to deploy large fleets or networks of tiny spacecraft at tremendous distances. Notable achievements in the 21st century included a partial 3D map of the Kuiper Belt; Neptune and its satellites being explored in more detail, especially the moons' surface environments; and comprehensive studies of more distant locations such as Pluto, Makemake, Haumea, and objects in the Scattered Disk, including Eris and Sedna. These probes – some drifting for over a trillion kilometres – continued to send back data for centuries. A few even reached the Oort Cloud, providing direct measurements of its composition, spatial extent, gravitational interactions, and other characteristics, as well as returning images of distant objects. By 2460, the first batteries to incorporate americium are beginning to die, more than 400 years after they first began operating. The Solar System is a very different place in the 25th century, however, with travel times having shrunk substantially. Much faster and more modern craft (with light speed capability) have subsequently located many of these probes – upgrading/replacing them, cannibalising them for other uses, or returning them for historical archiving.
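Taking the 432-year half-life from the text at face value, the state of such a battery after four centuries can be sketched with a standard exponential-decay calculation (a minimal illustration only; the "space battery" designs themselves are speculative, and real power output would also depend on thermocouple degradation):

```python
import math

HALF_LIFE_YEARS = 432  # half-life of americium-241, per the text

def fraction_remaining(years: float) -> float:
    """Fraction of the original americium still undecayed after `years`."""
    return 0.5 ** (years / HALF_LIFE_YEARS)

# After 400 years (roughly the 2460 scenario), just over half the fuel
# remains, so thermal output has fallen to about 53% of its initial value.
print(round(fraction_remaining(400), 3))  # → 0.526
```

This is consistent with the entry's claim: a battery "dying" after 400+ years has lost roughly half its heat source, at which point a probe designed around the original power budget can no longer operate.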

2500 AD: Mars has been terraformed. By the middle of this millennium, the Red Planet has been turned from a cold, dead world into a lush, Eden-like paradise. This monumental achievement was the result of human collaboration on a planet-wide scale. It created whole new industries and undreamed-of technologies. The project was initiated in the early 22nd century. From the outset, it faced major hurdles. Aside from the sheer scale of geo-engineering an entire world, there were political, social, and cultural issues too. Many of the early settlers on Mars actually wished for it to remain in its raw, primeval state. Some had an almost spiritual connection to the place. Like certain environments on Earth, they believed it held an intrinsic worth and unique value that should never be replaced. To alter it with artificial, manmade processes would somehow make it less natural, less real. These "Reds", as they came to be known, were a potent force during the early Martian government. They were a thorn in the side of planning authorities, who faced not only protests and demonstrations, but direct sabotaging of industrial activity from some of the more extreme individuals. On the opposing side of this debate were the "Greens", initially consisting of mostly corporate interests. They included a greater proportion of cyborgs and transhumans, who became adapted to the environment of Mars at an earlier stage and were thus able to survive in the lower air pressures. Over time, the power and influence of the Greens began to dominate. Improved security measures were introduced, guarding much of the infrastructure and terraforming equipment from attack. Giant solar mirrors were placed in orbit. Measuring tens of kilometres across, these reflected and focused the Sun's rays onto the poles. Other projects included the seeding of artificial extremophile bacteria. These began converting CO2 into oxygen.
Vast swarms of nanobots were later introduced, accelerating this process and offering a greater degree of control, since they were fully programmable. They also helped to produce nitrogen, as well as regulating the overall composition of the atmosphere. A series of enormous "heat factories" were constructed. These belched out huge amounts of CO2 – much of it imported from Venus – which was converted by the bacteria and nanobots. This also had the effect of raising air pressure. To increase the volume of water on Mars, comets and ice-rich asteroids were manoeuvred into orbit. These were made to slowly burn up as they descended, without impacting on the ground and causing damage. This still left the problem of Mars' lack of a magnetosphere, which exposed the surface to harsh ultraviolet radiation. Giant superconducting rings were placed around the latitude lines - focused mainly on the equator - and buried deep below ground. These were thousands of miles in length and took many decades to construct but were sufficient to create an artificial magnetic field. By the 23rd century, frozen lakes and ponds were beginning to form in some regions. This was followed by successful growing of the first lichen and mosses, genetically adapted to withstand a harsher environment. Entire seas and oceans began to appear in the 24th century, along with the first trees and other flora, plus certain arthropods and insects. A series of chain reactions and positive feedback loops began to accelerate the process, fueled by even greater technological advancements. By the 2400s, a panoply of animals was being introduced, including fish, birds, reptiles, and mammals. Eventually, it was declared safe for unaided, fully biological humans to walk on the surface of Mars. Humanity had created a second Earth.

2550 AD: Nitrogen trifluoride emissions from 2000 AD have been naturally reabsorbed. Nitrogen trifluoride (NF3) is a colourless, odourless, man-made greenhouse gas – most frequently used by the electronics industry in the late 20th and early 21st century. It was required mainly for plasma etching, the cleaning of chambers in which silicon chips were made, along with semiconductor and LCD panel manufacture. A number of other important applications were also found in the photovoltaic and chemical laser industries. Present only in trace amounts, nitrogen trifluoride was considered an environmentally preferable substitute for sulphur hexafluoride and fluorocarbons. Nevertheless, it was an extremely potent greenhouse gas when measured on a per molecule basis, with 12,300 times the heat-trapping ability of CO2 over a 20-year period and 17,000 times over 100 years. Despite its low concentration, emissions were also growing extremely rapidly, from just 0.02 ppt (parts per trillion) in 1980, to almost 1.0 ppt by the early 2010s, a 50-fold increase. Emission reduction strategies were proposed in subsequent years to limit its use and slow its contribution to global warming. The atmospheric lifetime – defined as the length of time that a molecule resides in the atmosphere before removal by chemical reaction or deposition – was calculated at 550 years. For NF3 emitted in the early 21st century, complete reabsorption by natural means would therefore take until 2550 AD.
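The "50-fold increase" above implies a steep compound growth rate, which can be checked with a short calculation (a sketch using the text's own figures; 2012 is assumed here as the endpoint, since the text only says "early 2010s"):

```python
ppt_1980 = 0.02   # NF3 concentration in 1980, parts per trillion
ppt_2012 = 1.0    # approximate concentration by the early 2010s
years = 2012 - 1980

# Compound annual growth rate implied by a 50-fold rise over ~32 years:
# (end / start) ** (1 / years) - 1
growth = (ppt_2012 / ppt_1980) ** (1 / years) - 1
print(round(growth * 100, 1))  # → 13.0 (percent per year)
```

A sustained ~13% annual growth rate means the concentration doubles roughly every five and a half years, which is why NF3 attracted regulatory attention despite its tiny absolute abundance.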

2600 AD: Plastics and other waste products are disappearing from Earth's biosphere. Most of the remaining plastics, tin cans, and other waste products from the 20th-21st centuries have decomposed by now. They caused significant harm to the environment during their time on Earth – injuring millions of birds, fish, and other animals. Subsequent generations of these materials were produced in ways that minimised their impact.

2640 AD: The longest-running music piece comes to an end. To an observer from the early 21st century, the world of the 27th century would be largely unrecognisable. By now, large portions of the Earth have been converted into a mixture of computronium and enormous scientific instruments, alongside dazzling architecture and works of art, whose mere appearance would defy the comprehension of an unaided human mind. Despite these intimidating physical and structural changes, many cultural icons from humanity's distant past have been preserved in their original form, even now. Among the historical relics to survive intact – dwarfed by the surrounding, high-tech landscape – is St. Burchardi Church, in a small region of what used to be known as Germany. Inside this old building is a musical instrument called an organ, one of the few remaining examples in its original form. In the year 1987, American composer John Milton Cage Jr. wrote a piece called Organ²/ASLSP (As Slow as Possible). A performance of the organ version began at St. Burchardi Church in Halberstadt, Germany, on 5th September 2001. This was scheduled to have a duration of exactly 639 years. The piece started with a silence of 17 months, followed by a first note sounding on 5th February 2003. Various chord changes would occur every year or two, using a total of six pipes, maintained with bellows providing a constant supply of air. By 2030, there had been a total of 21 notes played. On 5th September 2640, the final moments of this piece are witnessed by a small gathering of natural, biological humans. Thus ends the longest-running music concert in history. John Cage had written the piece, As Slow as Possible, to make a statement about the fast-paced life most people were leading.
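The dates above are internally consistent, which a few lines of standard-library date arithmetic confirm (a simple check of the figures quoted in the text, nothing more):

```python
from datetime import date

start = date(2001, 9, 5)       # performance begins in Halberstadt
first_note = date(2003, 2, 5)  # first sound after the opening silence

# A 639-year duration lands the finale on the anniversary of the start.
end = date(start.year + 639, start.month, start.day)
print(end)  # → 2640-09-05

# The opening rest spans 17 calendar months.
silence_months = (first_note.year - start.year) * 12 + (first_note.month - start.month)
print(silence_months)  # → 17
```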

2700 AD: Venus has been terraformed. By now, Venus has been transformed into a habitable, Earth-like world. The planet's entire orbit was shifted, bringing it closer to the "Goldilocks Zone", while its day-night cycle was accelerated from 117 days to 24 hours. Comets were redirected from the Oort Cloud and Kuiper Belt. These were guided through the solar system and into the upper atmosphere of Venus, releasing vast quantities of water. Other techniques involved the capture and removal of CO2, achieved by the seeding of nanobots, which absorbed this and other toxic gases and replaced them with breathable oxygen. With lush tropical oceans, Venus now has two large, dominant landmasses - Aphrodite and Ishtar - along with a number of smaller subcontinents and islands. The average surface temperature has stabilised at around 25°C.

2739 AD: The next entity of balance is born. The next entity of balance is said to be called Captain Karl, one of the only fully organic humans left. Despite the best efforts of the world, he never inherits any genetic modification, even when injected. He lacks the genes that allow his immune system to attack cancer cells, as was typical for people long ago, but this weakness is also completely nullified because his genes do not mutate. While vaccines still work on him, he is immune from anything trying to modify his DNA; not even the modified genetics of his parents can make him inherit any of their upgrades. He also faces another struggle later in life: since technology can mimic entity powers, he will struggle to fight even a regular person, let alone a skilled one. Another weakness is that his primary element – the element he learns first – is earth. Normally this would be great, but humanity has already turned the entire planet into a supercomputer by transforming everything into computronium. While this doesn't change how the planet works, the huge drawback is that earth cannot manipulate computronium. This drastically reduces earth's power, from being able to hit with the force of the entire planet to just some stone boxing gloves. It also reduces earth's defence, from being able to shatter anyone's fist to just a slow, thick jacket, at which point anyone can beat earth in a fight if they know what they are doing. This change does still affect the other elements, but earth is the worst hit.

This is the furthest I can see into the future, so I have left Captain Karl's details here.

Location: Latitude: -75.71141, Longitude: -108.34320

Date: 21/11/2739

Time: 07:55

Colour pattern: magenta and blue (cyan blue)

Height: 180cm (shorter than normal people at the time)

Eyes: blue

Hair: brown

Home country: Antarctica