
Tech Trends 2023 Prologue – A brief history of the future

From Deloitte's Tech Trends 2023: The technology forces shaping tomorrow. Learn how pioneering organizations adopt innovations while continuing to operate seamlessly as they grow.

Several years ago, at a demo day at Silicon Valley’s Computer History Museum, I came face to face with the history of the future. At the time, I was a venture capitalist on the hunt for the next big thing. During a break from startup pitches from the best and brightest entrepreneurs, I wandered among the museum exhibits, where I stumbled upon a modern recreation of the first computer, designed in the 1840s by English polymath Charles Babbage.

I was fascinated to read about Babbage’s Victorian-era designs, particularly his analytical engine, a mechanical general-purpose computer that he worked on with fellow mathematician Ada Lovelace. The analytical engine shared many features with modern digital computers, including three key components: the reader, the mill, and the store. The reader took in punch cards, permitting user interaction with the machine. The store held information—numbers and interim results—until they could be acted upon by the mill, which performed mathematical computations.
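To make the parallel concrete, here is a minimal Python sketch of that three-part architecture: a reader that consumes instruction "cards," a store that holds interim values, and a mill that computes. The class and card format here are invented for illustration, not a model of Babbage's actual instruction set.

```python
# A toy illustration of Babbage's three-part architecture:
# the reader (input), the store (memory), and the mill (computation).
# All names and the card format are illustrative, not historical.

class AnalyticalEngineSketch:
    def __init__(self):
        self.store = {}  # the store: holds numbers and interim results

    def read(self, punch_cards):
        """The reader: takes in 'punch cards' (here, simple tuples)."""
        for card in punch_cards:
            self.execute(card)

    def execute(self, card):
        """The mill: performs the computation a card requests."""
        op, target, a, b = card
        # Operands may be literals or names of values already in the store.
        a_val = self.store.get(a, a)
        b_val = self.store.get(b, b)
        if op == "ADD":
            self.store[target] = a_val + b_val
        elif op == "MUL":
            self.store[target] = a_val * b_val

engine = AnalyticalEngineSketch()
engine.read([("ADD", "x", 2, 3), ("MUL", "y", "x", 4)])
print(engine.store)  # {'x': 5, 'y': 20}
```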

Babbage couldn’t have known then that these three fundamental functions would still exist today, serving as the enduring foundation of modern computing. In fact, as we demonstrated in a joint research report with the World Economic Forum, the entire history of IT has been a steady evolution of these same three eternities: interaction, information, and computation.1 In turn, it stands to reason that the future of IT will continue to march along these same three tracks toward specific, convergent endgames: simplicity, intelligence, and abundance (figure 1).

Interaction: Toward simplicity

Electronic, digital general-purpose computers appeared about 100 years after Babbage’s design. Room-sized computers weighed tons and were programmed with punch cards, but within three decades, users interacted with desk-sized computers using the command-line interface.

By the 1990s, desktop-sized computers boasted graphical user interfaces, and simple iconography replaced arcane computer syntax. Later, point-and-click evolved to touch-and-swipe on portable computers carried in pockets and worn on wrists, and to virtual assistants that can understand voice commands. Today, extended reality can take us to immersive 3D universes where our digital doppelgangers interact and engage in virtual experiences.

What’s next for interaction?

The technologies that power human-computer interaction get more complex, but user experiences get simpler.

So what’s simpler still? Ambient experiences, in which ubiquitous digital assistants monitor the environment, awaiting a voice, gesture, or glance, reacting to (or proactively anticipating) and fulfilling our requests. And beyond that? Neural interfaces that afford direct communication between biological thought and digital response. Today’s smart thermostats accept voice control; tomorrow’s will know you feel chilly and proactively adjust to ensure your comfort. Researchers are already exploring how neural interfaces might help people with certain disabilities use brain signals to control external devices.

Information: Toward intelligence

When Babbage designed his analytical engine, information meant numbers and, later, mathematical operations. Over time, arithmetical calculations gave way to relational databases of clearly defined and structured data. By the aughts, databases became advanced enough to manage unstructured data such as text, audio, and video. This structured and unstructured data could, in turn, be mined for patterns and trends. So began the era of descriptive analytics.
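As a small illustration of what descriptive analytics does, the sketch below summarizes what has already happened in a structured dataset, using only Python's standard library; the sales figures are invented purely for illustration.

```python
# Descriptive analytics in miniature: summarizing observed, structured data.
# The monthly sales figures below are invented for illustration.
from statistics import mean, median, stdev

monthly_sales = [120, 135, 128, 150, 161, 158, 170, 166, 180, 175, 190, 188]

print(f"mean:   {mean(monthly_sales):.1f}")    # average monthly sales
print(f"median: {median(monthly_sales):.1f}")  # middle value
print(f"stdev:  {stdev(monthly_sales):.1f}")   # spread around the mean
print(f"change: {monthly_sales[-1] - monthly_sales[0]:+d} over the year")
```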

The last decade or so saw the rise of predictive analytics: what we can expect to happen based on observed patterns and trends. Today, cognitive automation systems combine predictive analytics with algorithms and AI to make useful data-driven decisions in real time.
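By contrast, predictive analytics projects forward. A minimal sketch, reusing the same invented sales series, fits an ordinary-least-squares trend line and extrapolates it one step ahead; real systems use far richer models, but the shape of the question, "what should we expect next?", is the same.

```python
# Predictive analytics in miniature: fitting a trend to past observations
# and extrapolating it forward. Ordinary least squares on invented data,
# a stand-in for the far richer models used in practice.

monthly_sales = [120, 135, 128, 150, 161, 158, 170, 166, 180, 175, 190, 188]
n = len(monthly_sales)
xs = range(n)

x_mean = sum(xs) / n
y_mean = sum(monthly_sales) / n
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, monthly_sales)) \
        / sum((x - x_mean) ** 2 for x in xs)
intercept = y_mean - slope * x_mean

# "What can we expect to happen?" -- project the fitted line one month out.
forecast = intercept + slope * n
print(f"expected next month: {forecast:.0f}")
```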

What’s next for information?

As our information systems continue to advance, machine intelligence itself will become increasingly well rounded.

Computer scientist Larry Tesler once quipped, “Artificial intelligence is whatever hasn’t been done yet.”2 The future of AI, then, might be broadly defined as exponential intelligence: a progression up the curve of capabilities that have, to date, seemed “uniquely human.”

Affective AI—empathic emotional intelligence—will result in machines with personality and charm. We’ll eventually be able to train mechanical minds with uniquely human data—the smile on a face, the twinkle in an eye, the pause in a voice—and teach them to discern and emulate human emotions. Or consider generative AI: creative intelligence that can write poetry, paint a picture, or score a soundtrack.

After that, we may see the rise of general-purpose AI: intelligence that has evolved from simple math to polymath. Today’s AI is capable of single-tasking, good at playing chess or driving cars but unable to do both. General-purpose AI stands to deliver versatile systems that can learn and imitate a collection of previously uniquely human traits.

Computation: Toward abundance

Computation turns inputs into outputs. From mill to mainframe to minicomputer to client-server, advances in computation were a story of miniaturization: Moore’s law and the relentless march toward better, faster, cheaper, and stronger. That story changed over the decades with advances in virtualization, culminating in modern cloud architectures. Computing became a distributed utility, promising elasticity, flexibility, and possibility to those embracing it.

Today, the shift to the cloud has, in turn, given further rise to decentralization: technologies and platforms rooted in the cryptographically secure blockchain. Decentralization recognizes that millions of processors, disks, and other resources sit idle much of the time, and that they can be marshaled into a shared pool. Decentralized storage, compute, domain name system (DNS), and yes, currencies, spread the work and the trust across a community of network participants, demonstrating that none of us is as capable, or as trustworthy, as all of us.
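A toy sketch can show the core idea: name each piece of data by its cryptographic hash, and let that hash deterministically decide which participant holds it, so work and trust are spread across the network rather than vested in one party. The node names and placement rule below are invented simplifications; real decentralized systems add replication, incentives, and consensus.

```python
# A toy sketch of decentralized, content-addressed storage: each blob is
# named by its hash, and the hash decides which peer holds it, so no
# single node must be trusted with, or index, everything.
# Node names and data are invented for illustration.
import hashlib

nodes = ["node-a", "node-b", "node-c", "node-d"]
storage = {node: {} for node in nodes}

def put(data: bytes) -> str:
    digest = hashlib.sha256(data).hexdigest()   # content address
    node = nodes[int(digest, 16) % len(nodes)]  # deterministic placement
    storage[node][digest] = data
    return digest

def get(digest: str) -> bytes:
    node = nodes[int(digest, 16) % len(nodes)]  # anyone can recompute this
    return storage[node][digest]

address = put(b"shared document")
print(address[:12], get(address))  # the address, and the recovered blob
```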

What’s next for computation?

As computers continue to miniaturize, virtualize, and decentralize, our capacity to process data, create and curate content, develop and code, and solve problems is on an unstoppable march toward abundance.

Fueled by decentralized networks, edge computing, and advanced connectivity, the spatial web is likely to blur the lines between physical and virtual environments. As reality itself increasingly comes online, digital content will be seamlessly woven into our physical spaces, inseparable from our shared personal and professional experiences. And waiting in the wings? Quantum computing—going beyond bits entirely, and harnessing the quirky laws of quantum mechanics to speedily solve previously intractable problems with physics rather than mathematics.

Tech Trends 2023: Eyes to the skies, feet firmly on the ground

Futurists don’t have crystal balls. Instead, we subscribe to the notion that “the future is already here, albeit unevenly distributed.” Our Tech Trends team has spent the better part of 14 years looking across all sectors and geographies for glimpses of pioneering leaders building distinct facets of the future, today. Fully half of the trends that we’ve chronicled fit into the three enduring categories of interaction, information, and computation described above.

But why only half?

Startups often embrace the mantra “move fast and break things.” It’s easier for them to be disruptive because they’re definitionally starting from zero and don’t yet have a legacy to protect. Established organizations, on the other hand, very much do. Successful businesses realize they can’t risk breaking “now” in pursuit of “new.” Our responsibility is to balance our pioneering inclinations with the solemn duty of stewardship; to do no harm, the Hippocratic oath of IT. Responsible enterprise professionals must nurture what they have now as they seek to navigate to what’s next.

To this end, we further chronicle emerging trends in three additional categories—the business of technology, cyber and trust, and core modernization—to acknowledge the reality that business drives technology, not the other way around, and that extant systems and investments need to play nicely with pioneering innovations so that businesses can seamlessly operate while they grow.

Taken together, we call these the six macro technology forces of information technology (figure 2).

We’ve arrived at this year’s trends through primary research and lived experience, interviewing industry and public sector leaders who have developed innovations in everything from resilient manufacturing and data repatriation to digital and biometric credentialing. Their input helped us shape the six trends chronicled in Tech Trends 2023.

As we prepare for launch, I’d encourage a moment of perspective-cum-humility. Futurists are secretly historians. And as Mark Twain reportedly said, “History doesn’t repeat itself, but it often rhymes.”3 Having worked in all things newfangled for 25 years, I’ve seen literally thousands of self-styled “world-changing technologies,” but none that have marked “the end of history.” It’s sobering to realize that today’s white-hot innovations will indeed become tomorrow’s legacy applications, and that our pioneering advances might one day be dismissed by the new generation as “the old way.” This is not meant to depress, but to embolden. It might be said that success for us as makers is building something significant and sustainable enough that our successors take notice and flag it for further modernization. Our job, dear reader and fellow leader, is not to hubristically chase “future-proof,” but to humbly target “future-friendly.”

Onward,

Source – https://www2.deloitte.com/us/en/insights/focus/tech-trends.html
