ChannelLife New Zealand - Industry insider news for technology resellers
Gartner’s top 10 tech trends for 2019
Tue, 16th Oct 2018

Gartner has highlighted the top strategic technology trends that organisations need to explore in 2019.

“The Intelligent Digital Mesh has been a consistent theme for the past two years and continues as a major driver through 2019,” says Gartner Fellow and vice president David Cearley.

“For example, AI in the form of automated things and augmented intelligence is being used together with IoT, edge computing and digital twins to deliver highly integrated smart spaces. This combinatorial effect of multiple trends coalescing to produce new opportunities and drive new disruption is a hallmark of the Gartner top 10 strategic technology trends for 2019.”

The top 10 strategic technology trends for 2019

1. Autonomous things

Autonomous things, such as robots, drones and autonomous vehicles, use AI to automate functions previously performed by humans.

Their automation goes beyond that provided by rigid programming models; they exploit AI to deliver advanced behaviours that interact more naturally with their surroundings and with people.

“As autonomous things proliferate, we expect a shift from stand-alone intelligent things to a swarm of collaborative intelligent things, with multiple devices working together, either independently of people or with human input,” says Cearley.

2. Augmented analytics

Augmented analytics focuses on a specific area of augmented intelligence, using machine learning to transform how analytics content is developed, consumed and shared.

Automated insights from augmented analytics will also be embedded in enterprise applications to optimise the decisions and actions of all employees within their context.

“Through 2020, the number of citizen data scientists will grow five times faster than the number of expert data scientists,” says Cearley.

“Organisations can use citizen data scientists to fill the data science and machine learning talent gap caused by the shortage and high cost of data scientists.”

3. AI-driven development

The market is rapidly shifting from an approach in which professional data scientists must partner with application developers to create most AI-enhanced solutions to a model in which the professional developer can operate alone using predefined models delivered as a service.

By 2022, at least 40% of new application development projects will have AI co-developers on their team.

“Ultimately, highly advanced AI-powered development environments automating both functional and nonfunctional aspects of applications will give rise to a new age of the ‘citizen application developer', where non-professionals will be able to use AI-driven tools to automatically generate new solutions,” says Cearley.

4. Digital Twins

A digital twin refers to the digital representation of a real-world entity or system.

By 2020, Gartner estimates there will be more than 20 billion connected sensors and endpoints and digital twins will exist for potentially billions of things. Organisations will implement digital twins simply at first. They will evolve them over time, improving their ability to collect and visualise the right data, apply the right analytics and rules, and respond effectively to business objectives.
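The mechanics of a digital twin can be made concrete with a small sketch: a software object that mirrors a physical asset's state from incoming telemetry and runs simple analytics rules against it. The `PumpTwin` class, its fields and the 80°C threshold below are illustrative assumptions, not part of any real product.

```python
from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    """A hypothetical digital twin of an industrial pump."""
    asset_id: str
    temperature_c: float = 0.0
    rpm: float = 0.0
    history: list = field(default_factory=list)

    def ingest(self, reading: dict) -> None:
        """Update the twin's mirrored state from a sensor telemetry message."""
        self.temperature_c = reading.get("temperature_c", self.temperature_c)
        self.rpm = reading.get("rpm", self.rpm)
        self.history.append(reading)

    def overheating(self, limit_c: float = 80.0) -> bool:
        """A simple analytics rule evaluated against the mirrored state."""
        return self.temperature_c > limit_c

twin = PumpTwin(asset_id="pump-42")
twin.ingest({"temperature_c": 85.5, "rpm": 1450})
print(twin.overheating())  # True
```

The "evolve over time" point in the text corresponds to layering richer data collection, analytics and rules onto a twin that starts out this simple.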

“One aspect of the digital twin evolution that moves beyond IoT will be enterprises implementing digital twins of their organisations (DTOs),” says Cearley.

“DTOs help drive efficiencies in business processes, as well as create more flexible, dynamic and responsive processes that can potentially react to changing conditions automatically.”

5. Empowered Edge

The edge refers to endpoint devices used by people or embedded in the world around us.

Cloud computing and edge computing will evolve as complementary models with cloud services being managed as a centralised service executing not only on centralised servers, but in distributed servers on-premises and on the edge devices themselves.

Over the next five years, specialised AI chips, along with greater processing power, storage and other advanced capabilities, will be added to a wider array of edge devices.

Longer term, as 5G matures, the expanding edge computing environment will have more robust communication back to centralised services.

6. Immersive Experience

Conversational platforms are changing the way in which people interact with the digital world. VR, AR and mixed reality (MR) are changing the way in which people perceive the digital world.

“Over time, we will shift from thinking about individual devices and fragmented user interface (UI) technologies to a multichannel and multimodal experience,” says Cearley.

“This multi-experience environment will create an ambient experience in which the spaces that surround us define ‘the computer' rather than the individual devices. In effect, the environment is the computer.”

7. Blockchain

Blockchain, a type of distributed ledger, promises to reshape industries by enabling trust, providing transparency and reducing friction across business ecosystems, potentially lowering costs, reducing transaction settlement times and improving cash flow.

Today, trust is placed in banks, clearinghouses, governments and many other institutions as central authorities with the ‘single version of the truth' maintained securely in their databases. The centralised trust model adds delays and friction costs (commissions, fees and the time value of money) to transactions. Blockchain provides an alternative trust model and removes the need for central authorities in arbitrating transactions.
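The tamper-evidence underpinning this alternative trust model can be illustrated with a toy hash-chained ledger: each block commits to the previous block's hash, so altering any earlier record invalidates every hash that follows. This is a teaching sketch only; a real blockchain adds distributed consensus, digital signatures and peer-to-peer replication, all omitted here.

```python
import hashlib
import json

def block_hash(contents: dict) -> str:
    """Deterministic SHA-256 hash of a block's data and prev_hash."""
    payload = json.dumps(contents, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list, data: dict) -> None:
    """Append a block that commits to the previous block's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"data": data, "prev_hash": prev}
    block["hash"] = block_hash({"data": data, "prev_hash": prev})
    chain.append(block)

def verify(chain: list) -> bool:
    """Recompute every hash; any tampering breaks the chain."""
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        if block["prev_hash"] != expected_prev:
            return False
        if block["hash"] != block_hash(
            {"data": block["data"], "prev_hash": block["prev_hash"]}
        ):
            return False
    return True

chain = []
append_block(chain, {"from": "A", "to": "B", "amount": 10})
append_block(chain, {"from": "B", "to": "C", "amount": 4})
print(verify(chain))               # True
chain[0]["data"]["amount"] = 999   # tamper with history
print(verify(chain))               # False
```

Because no single record can be rewritten without invalidating everything after it, participants can verify the ledger themselves rather than trusting a central authority's database.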

“Current blockchain technologies and concepts are immature, poorly understood and unproven in mission-critical, at-scale business operations,” says Cearley.

“Despite the challenges, the significant potential for disruption means CIOs and IT leaders should begin evaluating blockchain, even if they don't aggressively adopt the technologies in the next few years.”

8. Smart spaces

Multiple elements come together in a smart space to create a more immersive, interactive and automated experience for a target set of people and industry scenarios.

“This trend has been coalescing for some time around elements such as smart cities, digital workplaces, smart homes and connected factories,” says Cearley.

“We believe the market is entering a period of accelerated delivery of robust smart spaces, with technology becoming an integral part of our daily lives.”

9. Digital ethics and privacy

People are increasingly concerned about how their personal information is being used by organisations in both the public and private sector, and the backlash will only increase for organisations that are not proactively addressing these concerns.

“Any discussion on privacy must be grounded in the broader topic of digital ethics and the trust of your customers, constituents and employees,” says Cearley.

“While privacy and security are foundational components in building trust, trust is actually about more than just these components. Trust is the acceptance of the truth of a statement without evidence or investigation. Ultimately, an organisation's position on privacy must be driven by its broader position on ethics and trust. Shifting from privacy to ethics moves the conversation beyond ‘are we compliant' toward ‘are we doing the right thing'.”

10. Quantum computing

Quantum computing (QC) is a type of non-classical computing that operates on the quantum state of subatomic particles that represent information as elements denoted as quantum bits (qubits).

The parallel execution and exponential scalability of quantum computers mean they excel with problems too complex for a traditional approach or where traditional algorithms would take too long to find a solution.
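A qubit's quantum state can be simulated classically for small cases, which is one practical way to build the understanding Cearley recommends. The sketch below (using NumPy, an illustrative choice) represents one qubit as a vector of complex amplitudes and applies a Hadamard gate to create an equal superposition; the fact that n qubits require 2**n amplitudes is exactly the exponential scalability the text refers to.

```python
import numpy as np

# A qubit's state is a unit vector of complex amplitudes; the squared
# magnitudes of the amplitudes give measurement probabilities (Born rule).

ket0 = np.array([1, 0], dtype=complex)  # the |0> basis state

# Hadamard gate: maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0              # equal superposition of |0> and |1>
probs = np.abs(state) ** 2    # measurement probabilities
print(probs)                  # [0.5 0.5] -- 50/50 chance of 0 or 1
```

Simulating even 50 qubits this way would need 2**50 amplitudes, which is why classical simulation breaks down and why real quantum hardware is needed for the at-scale problems Gartner describes.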

“CIOs and IT leaders should start planning for QC by increasing their understanding of the technology and how it can apply to real-world business problems,” says Cearley.

“Learn while the technology is still in the emerging state. Identify real-world problems where QC has potential and consider the possible impact on security. But don't believe the hype that it will revolutionise things in the next few years. Most organisations should learn about and monitor QC through 2022 and perhaps exploit it from 2023 or 2025.”