It has been a recurring topic on the Architechnologist: two impediments are holding back the flood of new technology that will come from current innovation – batteries and bandwidth. This article is the first in a new series discussing the present and future of these two necessary facets of our technological future.
The Internet's current bottleneck is not big data, the Internet of Things (IoT), or even the tremendous amount of video being streamed – it is the "pipe" itself. Thanks to all of those wonderful things, users are bumping against the maximum limit of what can move through even the most robust networks.
In the near future, demand will leap past the capacity of any physical transmission medium – currently available 4K video streams can already slow entire networks to a near standstill. Further, as augmented and virtual reality content becomes readily available to the typical user, a new solution will be needed to expand the amount of data that can be carried over conventional routes.
A recent discussion with Oliver Christie, a recognized expert in data compression, focused on the probability that a software-driven answer to the limits of the transmission medium is likely the only acceptable one. Much like the fictional algorithms in HBO's "Silicon Valley" series, Christie foresees that only massive, lossless compression can provide a means of transmitting the enormous amounts of data in the content being created.
AR and VR data sets are tremendous – bigger than 3D movie files, no matter what the Weissman score is.
Christie further proposed that an even larger issue remains even if a method of massive compression is achieved: when the information carried in those massive content files is itself critical, zero data loss becomes the highest priority. For example, super-high-resolution 3D medical scans are already so large that transmitting them can be impractical without compression – yet any loss of detail in those files could lead to mistakes in diagnosis or treatment.
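As a toy illustration of the zero-data-loss requirement Christie describes, a general-purpose lossless codec such as zlib (an assumption for this sketch – not Christie's method, and nowhere near the "massive" compression he envisions) can shrink redundant data and then reconstruct it bit for bit:

```python
import zlib

# Simulate a highly redundant payload standing in for a large scan file
# (real medical imagery would be far larger and far less compressible).
original = bytes(range(256)) * 4096  # 1 MiB of repeating bytes

compressed = zlib.compress(original, level=9)
restored = zlib.decompress(compressed)

# Lossless means the round trip must reproduce every byte exactly --
# the property that matters when a lost detail could mean a misdiagnosis.
assert restored == original
print(f"{len(original)} -> {len(compressed)} bytes")
```

Lossy codecs (like the ones behind most streamed video) achieve far better ratios precisely by discarding detail – acceptable for entertainment, unacceptable for a diagnostic scan.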
Today, two-thirds of consumed bandwidth is standard-definition video… in the near future it seems inevitable that larger files will take up an even greater share. Solutions like those envisioned by HBO's fictional "Pied Piper" – or the real world's Oliver Christie – will likely be our first, best answer.
This post is cross-published on the Architechnologist, a site dedicated to exploring technologies that change the way we experience the world around us. For the stories behind the content – information that often drives upcoming news or the first glimmers of the next generation of ideas, please accept a free trial of a curated weekly newsmagazine, the Curated Architechnologist, by clicking here.