White Paper
In 2019, innovative technologies will drive revolutionary changes in both the way we do business and the way we run our homes.
Computing power beyond comprehension, the rise of decentralized computing, and much higher scrutiny of online privacy are among the contenders that will revolutionize our lives. All signs point to exciting developments in AI-driven development, quantum computing, edge computing, digital twin technology, blockchain, and digital ethics and privacy.
AI-driven development is exactly what it sounds like: development processes supported and guided by artificial intelligence (AI). This trend toward automating development tasks has the potential to transform the software industry by taking repetitive, low-level work and transferring it to intelligent machines.
Software development as it exists today is a slow, labor-intensive process. It has gotten faster with the adoption of agile workflows, but there are still many time-consuming tasks managed by humans. According to InfoWorld, software developers spend nearly 10 hours per week waiting for software builds or tests to complete and managing their development environments.
In many organizations, developers expect not to catch every bug in testing. They look to users and customers to identify and report problems after launch. This approach has unfortunate impacts: according to DevOps testing company Tricentis, there were 548 recorded software failures in 2016 that impacted $1.1 trillion in assets across 4.4 billion people.
AI-driven development aims to change the narrative by applying AI to the heavy load of repetitive tasks, supporting developers, and speeding up time-to-market. According to Gartner, AI-driven development processes will be in use in nearly 40% of new application projects by 2022.
Here are just a few examples of AI-driven development in practice today: intelligent code completion that suggests entire lines as developers type, automated test generation and test selection, and machine-learning models that flag likely bugs during code review.
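One way to picture the last of those examples is a simple model trained on past build history. The hypothetical Python sketch below ranks tests by their predicted risk of failing for a new change, so the riskiest tests can run first; the features, data, and test names are invented for illustration and are not drawn from any specific product.

```python
# Hypothetical sketch: rank tests by predicted failure risk for a new change.
# The training data, features, and test names are made up for illustration.
from sklearn.ensemble import GradientBoostingClassifier

# Each row describes one (change, test) pair from past builds:
# [files changed, lines changed, times this test failed in the last 30 builds]
X_history = [
    [1,  12, 0],
    [5, 240, 3],
    [2,  35, 1],
    [8, 410, 5],
    [1,   4, 0],
    [3,  90, 2],
]
y_history = [0, 1, 0, 1, 0, 1]  # 1 = the test failed on that build

model = GradientBoostingClassifier().fit(X_history, y_history)

# Features for the tests affected by the change a developer just pushed.
candidate_tests = {
    "test_checkout_flow":  [4, 180, 2],
    "test_login":          [1,  10, 0],
    "test_inventory_sync": [6, 320, 4],
}

# Run the tests most likely to fail first; defer the rest to a nightly run.
ranked = sorted(
    candidate_tests.items(),
    key=lambda item: model.predict_proba([item[1]])[0][1],
    reverse=True,
)
for name, _ in ranked:
    print(name)
```

The payoff of a setup like this is shorter feedback loops: developers see the failures that matter within minutes instead of waiting for the full suite.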
“There will be a long ramp-up as knowledge diffuses through the developer community, but in ten years I predict most software jobs won’t involve programming.”
– Pete Warden, Google Research Engineer
Quantum computing, which (for many of us) still seems like science fiction, is getting closer to real-world application. However, the biggest hurdle beyond building a functioning quantum computer may be understanding exactly what quantum computing is, how it works, and what types of problems it might solve.
The best analogy is to imagine a huge library filled with books. A traditional computer can read each book in the library, one after the other. A quantum computer can read all the books at the same time. With several hundred qubits (or quantum bits), a quantum computer could perform more concurrent calculations than there are atoms in the entire universe!
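The arithmetic behind that claim is simple to check: a register of n qubits can hold a superposition of 2^n basis states, while the number of atoms in the observable universe is commonly estimated at around 10^80. A sketch of the comparison, assuming a 300-qubit register:

```latex
\underbrace{2^{n}}_{\text{states of an } n\text{-qubit register}}
\quad\Rightarrow\quad
2^{300} \approx 2 \times 10^{90} \;\gg\; 10^{80} \approx \text{atoms in the observable universe}
```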
Quantum computing could enable pharmaceutical companies to simulate human reactions to drugs, optimize flight paths across the U.S. to reduce delays, or create stronger security protocols that not only defend against cyber threats but anticipate them. And the race is on as companies across the globe work to build a stable, functioning quantum computer. IBM, Microsoft, Google, and Intel are all contenders in the U.S., and the E.U. and China are investing heavily in quantum computing research and development.
However, the most challenging step in the near future is building and maintaining these machines. Quantum computers are plagued by constant errors, as delicate qubits degrade quickly and can be affected by the slightest change in temperature or vibration. And the more qubits you introduce, the harder it is to maintain the computer and keep errors in check.
One real-world example that shows promise is Volkswagen’s partnership with D-Wave Systems, a quantum computer manufacturer, to reduce traffic congestion in Beijing. Quantum computing was applied to billions of data inputs on 10,000+ taxis to generate traffic solutions in seconds.
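The published Volkswagen/D-Wave work reportedly cast traffic flow as a quadratic unconstrained binary optimization (QUBO) problem, the format quantum annealers accept: each car picks one of a few candidate routes, and the objective penalizes routes that pile onto the same road segments. The toy Python sketch below builds that kind of congestion objective for two cars and solves it by brute force; the street segments and routes are invented, and a real system would hand the encoded problem to an annealer through D-Wave's cloud service rather than looping over every option.

```python
# Toy illustration (invented data): route assignment as a QUBO-style objective.
# Each car picks exactly one of its candidate routes; routes that share road
# segments raise the congestion cost, so the best assignment spreads traffic out.
from itertools import product

routes = {                      # car -> candidate routes (sets of road segments)
    "car_A": [{"s1", "s2"}, {"s3", "s4"}],
    "car_B": [{"s1", "s5"}, {"s4", "s6"}],
}

def cost(assignment):
    """Congestion cost: number of cars on each segment, squared and summed."""
    load = {}
    for car, route_idx in assignment.items():
        for segment in routes[car][route_idx]:
            load[segment] = load.get(segment, 0) + 1
    return sum(n * n for n in load.values())

# Brute-force search over all assignments; a quantum annealer would explore the
# same objective, encoded over binary route-choice variables, in a single anneal.
best = min(
    (dict(zip(routes, choice))
     for choice in product(*(range(len(r)) for r in routes.values()))),
    key=cost,
)
print(best, "cost:", cost(best))
```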
A big factor in the success of this endeavor was remote access to the quantum computer. Volkswagen’s engineers accessed it through the cloud, instead of having to support a delicate quantum computer on-site. Making quantum computers accessible will be the key to future success, as Goldman Sachs predicts quantum computing will be a $29 billion industry by 2021.
To understand edge computing, it helps to think about the massive shifts that have taken place over the last couple of decades. Computers started out as on-premise tools that had software stored on individual hard drives or servers. As internet usage grew, connections between computers increased, opening up the ability to access remote software in the cloud.
Fast forward to today, and decentralized computing is making a comeback. There is a massive amount of computing power in all the remote gadgets we use, like smart thermostats, fitness trackers, and virtual voice assistants. And these devices keep multiplying: as of mid-2018, there were roughly 7 billion Internet of Things (IoT) devices in use.
In many cases, a device’s function is only possible with real-time processing. It takes too long for an autonomous car to take in all the inputs related to driving, send them to the cloud, and wait for a response. The autonomous car must know whether to slow down, stop, or go — immediately.
Edge computing takes cloud processing and moves it to the source (aptly named “the edge”). In the autonomous car, this means a series of sensors (likely RFID tags) takes in all types of data: road conditions, wind speed, surrounding cars, traffic lights, and pedestrians. Those sensors pass the data they gather to an intelligent edge device in the car for real-time processing.
There are plentiful use cases for moving processing from the cloud to the edge. Shipping companies that had zero visibility can now track ships, containers, and even individual products. Hospitals can offer telemedicine services to patients in rural areas, mimicking an in-person visit. Hudson Yards, the biggest private real estate project in U.S. history, will likely teach us a great deal about how edge computing can support city infrastructure.
Though edge computing will rise in usage with a projected $6.72 billion market value by 2022, don’t expect the cloud to disappear. It still has an important function in optimizing the edge. The autonomous car doesn’t need the cloud to make real-time decisions. However, relevant data can stream to the cloud when traffic is low. Analysis in the cloud can take place at any time and transmit new programming for optimization.
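A minimal sketch of that division of labor, assuming invented sensor values, thresholds, and a placeholder upload function: the edge device applies a simple local rule immediately, buffers what it observed, and only pushes telemetry to the cloud when it is idle.

```python
# Hedged sketch of an edge-computing loop (all values and the upload target are
# invented for illustration). Decisions happen locally, in real time; telemetry
# is batched and sent to the cloud only when the device is idle.
import time

telemetry_buffer = []

def read_obstacle_distance_m():
    """Stand-in for a real sensor read (lidar, radar, camera pipeline, ...)."""
    return 7.5  # metres; hard-coded so the sketch runs anywhere

def apply_brakes(force):
    print(f"braking at {force:.0%}")

def upload_to_cloud(batch):
    """Placeholder for a cloud call; a real device would POST to its backend."""
    print(f"uploaded {len(batch)} readings for offline analysis")

def control_loop(idle: bool):
    distance = read_obstacle_distance_m()

    # Real-time decision made entirely on the edge device.
    if distance < 5.0:
        apply_brakes(1.0)      # emergency stop
    elif distance < 15.0:
        apply_brakes(0.3)      # ease off

    telemetry_buffer.append({"t": time.time(), "distance_m": distance})

    # The cloud is used for later optimization, not for the decision itself.
    if idle and telemetry_buffer:
        upload_to_cloud(telemetry_buffer)
        telemetry_buffer.clear()

control_loop(idle=False)   # driving: decide locally, keep buffering
control_loop(idle=True)    # parked / low traffic: sync with the cloud
```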
The phrase “digital twin” may be the most unfamiliar on our list, but the concept behind it has been around since NASA planned its first explorations into space. In order to manage space missions, troubleshoot problems, and deliver solutions to astronauts without being on board, NASA engineers built systems back on Earth that mirrored the spacecraft.
This idea of a functioning copy of a machine persisted, and later iterations became known as digital twins. Michael Grieves of the University of Michigan was the first to write about digital twin technology, in 2002. At the time, technical limitations kept digital twins from having real-world applications.
Faster internet connections, higher bandwidth on smaller computer chips, the growth of cloud and edge computing, and the expansion of artificial intelligence have since made digital twins functionally possible.
And analysts are bullish on its prospects, with ambitious market projections to match.
The importance of digital twin technology lies in the need to observe a machine operating from afar. Historically, engineers building anything from jet engines to cars to manufacturing machinery were limited by their ability to simulate using a model. With a digital twin, a machine can be outfitted with sensors that capture a wide range of data inputs in its native environment. And just like NASA, the engineer can ingest that data from anywhere in the world.
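At its simplest, a digital twin is a live software model that mirrors the state of one physical asset and can be queried from anywhere. The hypothetical Python sketch below keeps a twin of a single wind turbine in sync with streamed sensor readings and derives a maintenance flag from them; the field names and thresholds are assumptions made for illustration.

```python
# Hedged sketch of a digital twin: a software mirror of one physical asset,
# updated from field telemetry. All fields and thresholds are illustrative.
from dataclasses import dataclass, field

@dataclass
class TurbineTwin:
    asset_id: str
    rotor_rpm: float = 0.0
    bearing_temp_c: float = 0.0
    history: list = field(default_factory=list)

    def ingest(self, reading: dict):
        """Apply one telemetry message sent from sensors on the real turbine."""
        self.rotor_rpm = reading["rotor_rpm"]
        self.bearing_temp_c = reading["bearing_temp_c"]
        self.history.append(reading)

    def needs_inspection(self) -> bool:
        """Derived state an engineer can check from anywhere in the world."""
        return self.bearing_temp_c > 80.0 or self.rotor_rpm > 18.0

twin = TurbineTwin(asset_id="turbine-07")
twin.ingest({"rotor_rpm": 14.2, "bearing_temp_c": 61.0})
twin.ingest({"rotor_rpm": 16.8, "bearing_temp_c": 83.5})
print(twin.asset_id, "inspect:", twin.needs_inspection())
```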
Digital twin technology works for many use cases that can also be supported by edge computing. GE built a digital twin of a wind farm to learn from the performance of existing wind turbines and improve new construction. Agriculture will likely be an area of growth for digital twin technology, as it can provide remote crop, soil, and livestock monitoring.
“Digital twins are becoming a business imperative, covering the entire lifecycle of an asset or process and forming the foundation for connected products and services,” said SAP Senior Vice President of IoT Thomas Kaiser. “Companies that fail to respond will be left behind.”
Expect diverse uses of digital twins over the next five years, as the digital twin market is projected to grow to $15.66 billion by 2023.
Blockchain earns a place in our technology trends report for the second year in a row. The ingenious, peer-verified global ledger is still dominating the headlines, with one major difference: businesses across the globe are shifting from exploring blockchain as a technology to practical applications in current use cases.
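For readers new to the mechanics behind that ledger, here is a minimal, purely illustrative Python sketch of why it is hard to tamper with: each block commits to the hash of the block before it, so any peer can independently recompute the chain and detect changes. The transactions are made up.

```python
# Minimal sketch of the idea behind a blockchain ledger (illustrative only):
# each block stores the hash of the previous block, so tampering with any past
# entry changes every hash after it and is immediately detectable by peers.
import hashlib, json

def make_block(prev_hash: str, transactions: list) -> dict:
    body = {"prev_hash": prev_hash, "transactions": transactions}
    block_hash = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": block_hash}

genesis = make_block("0" * 64, [{"from": "alice", "to": "bob", "amount": 5}])
block_2 = make_block(genesis["hash"], [{"from": "bob", "to": "carol", "amount": 2}])

# Verification any peer can perform independently.
recomputed = hashlib.sha256(
    json.dumps({"prev_hash": block_2["prev_hash"],
                "transactions": block_2["transactions"]}, sort_keys=True).encode()
).hexdigest()
print("chain intact:", recomputed == block_2["hash"] and block_2["prev_hash"] == genesis["hash"])
```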
Deloitte’s 2018 Global Blockchain Survey found widespread optimism about the technology’s potential among the executives it polled.
Given the survey results, it might seem like blockchain will take the world by storm very shortly. However, the same survey found that of U.S. participants, only 14 percent stated that any instance of blockchain was in production for their business.
The reason blockchain expectations are high but U.S. adoption is low may be “blockchain fatigue”: many feel that blockchain is being touted as a holy grail that can solve almost any problem, yet practical examples of blockchain in action (aside from the ubiquitous Bitcoin) remain sparse.
However, diverse applications of blockchain are starting to pop up more frequently. Both the UN World Food Programme and the Finnish government are working to solve the problem of digital identity verification for refugees using blockchain. And, the Australian Securities Exchange made history by becoming the first major exchange to adopt blockchain technology.
“Blockchain will change the way we do business and interact with companies,” said Eric Brown, Founder and CEO of Aliant Payment Systems. “It will become commonly used for everyday transactions.”
Even though opinions are mixed on the viability of blockchain, businesses are expected to invest. The total blockchain market will have an estimated value of $2.31 billion by 2021, according to research site Statista.
With the speed of technological innovation today, previously unimaginable machines are becoming reality. But there are other factors to consider when developing technology: concerns about digital ethics and privacy.
In 2016, political data company Cambridge Analytica accessed private records of roughly 50 million Facebook users — the majority of whom did not provide consent. Recordings from voice assistants Amazon Echo and Google Home are being requested in U.S. courts as evidence in murder trials.
There have been warning signals about digital ethics and privacy for years. Edward Snowden leaked classified data from the NSA in 2013, unveiling major government surveillance programs, and media coverage of the massive Target and Equifax data breaches has raised visibility as well.
According to writer Natasha Lomas, “The key issue is about the abuse of trust that has been an inherent and seemingly foundational principle of the application of far too much cutting edge technology up to now.” However, Gartner predicts that the status quo may be changing.
One key piece of the puzzle is stricter government regulation of how businesses protect consumer data privacy. The European Union’s General Data Protection Regulation (GDPR) aims to empower individuals by restricting what companies can do with their data. GDPR applies to any business that processes the data of EU citizens, which means that companies all over the world have to comply.
Key tenets of the regulation, which took effect on May 25, 2018, include requiring a lawful basis (such as explicit consent) for processing personal data, giving individuals the right to access, correct, and erase their data, guaranteeing data portability, mandating breach notification within 72 hours, and building privacy into systems and processes by design.
Organizations are required to comply with GDPR or face lawsuits from data subjects and fines up to 4% of yearly turnover or 20 million euros (whichever is larger). The fallout has already begun — Facebook and Google were hit with lawsuits on the first day GDPR took effect, followed by Nielsen, British Airways, Carphone Warehouse, and Ticketmaster.
With more public scrutiny and financial consequences, the tide is turning. In a June 2018 survey by Dimensional Research, 20% of companies surveyed believed they were GDPR compliant, while 53% were still working through the compliance process. The California legislature passed its own version of the European Union’s GDPR in June 2018, and experts project that other U.S. states will follow suit.
Artificial intelligence has the potential to drastically change the work of software developers (for the better). Researchers will continue to experiment with ways to stabilize quantum computers to harness their massive computing power.
Edge computing will leverage the advantages of remote, decentralized computing to make real-time decisions. Digital twin technology will empower engineers to learn from devices in the field and thus optimize tools and processes.
Blockchain will continue its evolution from a powerful (yet unexplored) technology to a tested tool in the arsenal of many businesses. And due to both legislation and pressure from consumers, companies will make increased efforts to make ethically sound decisions and protect the digital privacy of users.
We’re excited to see how these technology trends play out in 2019. Stay tuned…
Schedule a live demo with a product expert to see how GoSpotCheck’s field execution management platform can transform your business.