Among the breakthroughs that scientists and engineers are pursuing in 2018, artificial intelligence somehow manages to be both the most promising and the most polarizing technology of our time.
As a collective, Big Tech is throwing billions of dollars at artificial intelligence, which those involved would rather we all call machine learning. The notion that we can teach computers to learn — to absorb data, recognize patterns, and take action — could have an enormous impact on nearly everything we do with a computer, and pave the way for computers to move into new and game-changing places, such as the self-driving car.
This technology still has a long way to go, even though we’ve been talking about it for decades. But it’s starting to become real, and alongside that progress has come perhaps one of the biggest backlashes yet against any development in information technology.
With all that in mind, we invited five experts in artificial intelligence to speak at our GeekWire Cloud Tech Summit in June. Their talks ranged from big-picture visions of what AI can do, and concerns about its impact, to specific ways that companies are implementing machine learning to solve problems everyone can get behind.
This is going to be an important topic for years to come as companies, governments, and citizens debate the parameters of how AI will be used in our lives. Within the following five videos, you’ll hear informed wisdom about the current state of this market as well as some practical tips for implementing AI within your own company’s products or services.
Emily Fox, Amazon Professor of Machine Learning, UW
AI relies on data, but things start to get tricky when that data is tightly coupled to the time when it was produced. Fox showed how AI can be applied to time-series data in order to unlock new insights about how our world is changing over the years, which could have a huge impact on forecasting and prediction models.
David Tennenhouse, Chief Research Officer, VMware
Tennenhouse opened the morning AI track with a wide-ranging discussion about the potential future of artificial intelligence, examining historic trends and modern data to paint a picture of AI’s potential for our world. What does seem clear is that a fair number of jobs that have so far survived the information technology revolution will succumb to advanced AI systems, and as a society we need to start preparing for that future, he said.
Sophie Lebrecht, Director of Operations at xnor.ai
Edge computing was a big topic at the GeekWire Cloud Tech Summit, from main-stage presentations by the likes of Microsoft Azure CTO Mark Russinovich to our track on serverless computing, a technology ideally suited for the edge. AI can help edge devices operate autonomously without relying on a spotty connection to the cloud, and Lebrecht walked attendees through specific ways edge computing applications can take advantage of AI without the chip horsepower that many AI applications demand.
Jay Bartot, CTO, Madrona Venture Labs
Despite the gloom and doom that often accompanies AI, there are lots of places, like health care, where it can make a huge difference. But in an era when tech companies are paying eye-popping salaries for the AI talent to develop these models, making sure organizations with limited resources can put the technology to use is becoming extremely important. Bartot talked about how simple pre-written AI models can be paired with data sets to expand the reach of AI technology.
Paige Bailey, Sr. Cloud Developer Advocate, Microsoft
Bailey took a broad view of how artificial intelligence can be applied to real-world workloads, such as how banks can use artificial intelligence to help make decisions about how they offer loans. Like many big cloud tech companies, Microsoft has made substantial investments in cloud services that bring artificial intelligence expertise to application developers who don’t have the time or inclination to learn how to create advanced machine-learning models.