By 2030, a new cellular network standard has emerged that offers even greater speeds than 5G. Early research on this sixth generation (6G) had started during the late 2010s when China,* the USA* and other countries investigated the potential for working at higher frequencies.
Whereas the first four mobile generations tended to operate at between several hundred and several thousand megahertz, 5G had expanded this range into the tens of thousands of megahertz with its millimetre wave bands. A revolutionary technology at the time, it allowed vastly improved bandwidth and lower latency. However, it was not without its problems, as exponentially growing demand for wireless data transfer put ever-increasing pressure on service providers, while even shorter latencies were required for certain specialist and emerging applications.*
This led to the development of 6G, based on frequencies ranging from 100 GHz to 1 THz and beyond. A boost in data transfer rates of around fifty times over 5G's theoretical peak of 20 gigabits per second would mean users enjoying terabits per second (Tbit/s). Furthermore, improved network stability and reduced latency – achieved with AI and machine learning algorithms – could be combined with even greater geographical coverage. The Internet of Things, already well-established during the 2020s, now had the potential to grow by further orders of magnitude and connect not billions, but trillions of objects.
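To see why the move to 100 GHz–1 THz carriers is what puts terabit-per-second rates on the table, the minimal Python sketch below applies the Shannon–Hartley capacity limit; the channel widths and the 20 dB signal-to-noise ratio are illustrative assumptions, not figures from any 6G specification.

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley limit: C = B * log2(1 + SNR), in bits per second."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative channel widths -- assumptions, not values from any standard.
channels = {
    "5G mmWave, 400 MHz channel": 400e6,
    "sub-THz, 10 GHz channel": 10e9,
    "THz, 100 GHz channel": 100e9,
}

SNR_DB = 20.0  # assumed link quality
for label, bw in channels.items():
    c = shannon_capacity_bps(bw, SNR_DB)
    print(f"{label}: ~{c / 1e9:.0f} Gbit/s theoretical ceiling")

# How much contiguous spectrum a single 1 Tbit/s link would need at the same SNR.
needed_hz = 1e12 / math.log2(1 + 10 ** (SNR_DB / 10))
print(f"1 Tbit/s at {SNR_DB:.0f} dB SNR needs ~{needed_hz / 1e9:.0f} GHz of bandwidth")
```

Even under these idealised assumptions, a single terabit-per-second link needs on the order of 150 GHz of contiguous spectrum, which only becomes plausible once carriers move into the sub-terahertz and terahertz bands; practical systems would also lean on carrier aggregation and spatial multiplexing to reach such rates.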
Following a decade of research and testing, widespread adoption of 6G occurs in the 2030s. However, progress in wireless telecommunications is now reaching a plateau, as it becomes extremely difficult to extend beyond the terahertz range.* These limits are eventually overcome, but require wholly new approaches and fundamental breakthroughs in physics. The idea of a seventh standard (7G) is also placed in doubt by several emerging technologies that augment the existing wireless networks, making future advances iterative rather than generational.*
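One way to see why extending beyond the terahertz range is so difficult is how free-space path loss scales with carrier frequency. The sketch below, using assumed frequencies and a fixed 100 m link distance, illustrates the trend; atmospheric and molecular absorption, which worsen sharply at these frequencies, are not modelled here.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

# Assumed carrier frequencies, compared over a fixed 100 m link.
for label, freq in [
    ("3.5 GHz (5G mid-band)", 3.5e9),
    ("28 GHz (5G mmWave)", 28e9),
    ("300 GHz (sub-THz)", 300e9),
    ("1 THz", 1e12),
]:
    print(f"{label}: {fspl_db(100.0, freq):.0f} dB of free-space loss over 100 m")
```

Each tenfold increase in carrier frequency adds roughly 20 dB of loss before absorption is even considered, a deficit that has to be recovered through denser cells, larger antenna arrays or new physical-layer techniques.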