Notes from #TheAIConf in San Francisco, CA, Sept 2018

Amitabha
5 min read · Sep 8, 2018

I decided to attend The AI Conference (#TheAIConf) in San Francisco, CA, as a participant, mainly because of the good feedback some prior attendees had given me, and because of O’Reilly’s reputation for hosting conferences with a good blend of deeply technical, semi-technical, and business-oriented talks.

The AI Conference lived up to my expectations and was very valuable for a novice like myself who is trying to understand the different aspects of AI. The conference also stayed focused on deep learning. The keynotes featured a star-studded cast, including Kai-Fu Lee of Sinovation Ventures, David Patterson and Dawn Song of UC Berkeley, and Peter Norvig of Google. Some key takeaways:


Prominently visible companies were Intel (a co-presenter), Google, AWS, and Microsoft. Google had continuous sponsored sessions going on, so one ballroom was pretty much dedicated to them, with speakers covering all aspects of TensorFlow, Colab, Tensor2Tensor, Tensor Processing Units (TPUs), etc. Here is a picture from the Google Cloud booth.

It is clear that all cloud providers see AI/Machine Learning (ML)/Deep Learning (DL) as a prominent use case and business driver. Many companies seem to be developing their AI models on top of AWS (SageMaker), Google Cloud, or Microsoft Azure. As an example, Joseph Sirosh (@josephsirosh), CTO of AI at Microsoft, gave a moving example of how cloud-backed AI can be used to devise cheap prosthetics that can revolutionize people's lives. Nvidia surprisingly had zero presence; maybe the Nvidia brand has become synonymous with AI, so Nvidia doesn't need to spend marketing dollars on such events.
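To make "developing on top of SageMaker" concrete, here is a minimal sketch of launching a training job with the SageMaker Python SDK. The training script, instance type, S3 path, and IAM role below are hypothetical placeholders of my own, not anything demonstrated at the conference.

```python
# Minimal sketch of a SageMaker training job (illustrative only).
# The entry-point script, S3 bucket, and IAM role are hypothetical placeholders.
import sagemaker
from sagemaker.tensorflow import TensorFlow

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # placeholder role

estimator = TensorFlow(
    entry_point="train.py",          # your training script
    role=role,
    instance_count=1,
    instance_type="ml.p3.2xlarge",   # single-GPU instance
    framework_version="2.11",
    py_version="py39",
    sagemaker_session=session,
)

# Kick off training against data already staged in S3 (placeholder path).
estimator.fit({"training": "s3://my-bucket/training-data/"})
```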

Focus on Hardware: There was a strong focus on hardware, particularly on edge devices running inference workloads. Intel seems to be betting its AI play on the idea that faster memory technologies will be required for deep learning, and that Intel Optane + Intel QLC (see here) will give a significant memory boost to power deep learning. Intel's bet is also that the large range of processors it builds will be applicable in many edge environments for inference workloads (see Morningstar analyst Abhinav Davuluri's report on this topic here). In a later keynote, Huma Abidi (@humaabidi), Director, Machine Learning, Intel, described how Intel researchers are attempting to improve the software-hardware interface to boost the performance of AI workloads. David Patterson from UC Berkeley talked about the RISC-V open source instruction set that will be used by many hardware vendors. Many presentations focused on hardware for low-powered/edge systems, such as those from Mythic, uTensor, and Cerebras Systems.
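To give a feel for what targeting low-powered edge devices for inference often involves, here is a minimal sketch of post-training quantization with the TensorFlow Lite converter. The tiny Keras model is a placeholder of my own; none of this is specific to Mythic, uTensor, or Cerebras.

```python
# Minimal sketch: shrink a model for edge inference with TensorFlow Lite.
# The tiny Keras model below is a placeholder; a real edge deployment would
# start from a trained network.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Convert to TFLite with default post-training quantization to cut model size
# and speed up inference on low-powered hardware.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("model_edge.tflite", "wb") as f:
    f.write(tflite_model)
```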

China is a huge market for AI: Driven by huge mobile consumption (3x), a large number of mobile payments (50x), a large number of bicycle rides (300x), and the ubiquity of cameras at traffic intersections and on streets, China is the most important market for AI (all numbers are relative to the USA). It is expected that 48% of AI money goes to China and 36% to the USA, and that China will be a 7 trillion dollar market by 2025. The Chinese government is doing multiple things to facilitate AI growth, such as enabling autonomous truck testing on highways built with sensors to drive. Cities are building special lanes for self-driving cars. While China may lag behind the USA in the quality of research, it will be way ahead on data collection. Kai-Fu Lee of Sinovation Ventures described how China's software companies have evolved over the generations, starting from copying US models to make them work in China (e.g., Alibaba.com) to leapfrogging US technologies, so that China-proven technologies now make their way to the USA. He described how Chinese entrepreneurs are brutal in competition and work 997 (9am to 9pm, 7 days a week). China and the US are the big twin drivers of AI, and will definitely be its twin superpowers. You can read more in his book, AI Superpowers, coming out on September 25.

Lots of talks on the pitfalls of AI: AI is data reliant, and therefore suffers from bias and dependency on past history. Issues such as facial recognition techniques being roughly 30% less accurate for women of color show the need to counter the inherent bias of any data-driven technique. AI has not been shown to be good at creativity and compassion; however, AI can be used to do the trivial tasks that consume a significant part of humans' time, leaving humans to do the creative work. Dawn Song (@dawnsongtweets) focused on the security aspects of AI, showing examples of how attackers can introduce bias by targeting the input data set.
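As a toy illustration of that last point, here is a minimal, entirely synthetic sketch of label-flipping data poisoning, where an attacker biases a model by tampering with its training set. This is my own example of the general idea, not the demonstration from the keynote.

```python
# Toy sketch of training-data poisoning (label flipping), showing how
# tampering with the input data set can bias a learned model.
# Entirely synthetic data; illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Clean, roughly linearly separable two-class data.
X = rng.normal(size=(1000, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

clean_model = LogisticRegression().fit(X, y)

# Attacker flips the labels of 20% of class-1 examples in the training set.
y_poisoned = y.copy()
targets = np.where(y == 1)[0]
flipped = rng.choice(targets, size=int(0.2 * len(targets)), replace=False)
y_poisoned[flipped] = 0

poisoned_model = LogisticRegression().fit(X, y_poisoned)

# The poisoned model is skewed against predicting class 1.
X_test = rng.normal(size=(1000, 2))
y_test = (X_test[:, 0] + X_test[:, 1] > 0).astype(int)
print("clean accuracy:   ", clean_model.score(X_test, y_test))
print("poisoned accuracy:", poisoned_model.score(X_test, y_test))
```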

Incumbent companies have a huge advantage because they sit on large warehouses of data. Only seven companies in the world can really afford to do large-scale AI: that is, they have the computation power, have access to massive data, and can pay millions of dollars a year to AI scientists. That makes these companies extremely powerful, and humans could be vulnerable to their power. Data sharing (like Kaggle) might level the field, but most datasets on Kaggle are better suited to programming contests than to building real-world use cases.

From a technology point of view, a lot of talks focused on Reinforcement Learning, Recurrent Neural Networks (RNNs), Long Short-Term Memory (LSTM), autoencoders, and time-series analysis and forecasting, as well as applications of ML/DL in the fields of robotics, self-driving cars, sports (baseball), storage systems, finance (investments and trading), chatbots, space, and earth science, to name a few. There really were a lot of interesting talks, some of whose slide decks are already available here. O'Reilly will be uploading and giving access to all talks via its Safari outlet; the anticipated release date for all videos is in 4 weeks.
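For readers new to some of these terms, here is a minimal sketch of the kind of LSTM-based time-series forecasting model several talks touched on. The sine-wave data, window size, and layer sizes are placeholder choices of my own.

```python
# Minimal sketch: LSTM time-series forecasting on a synthetic sine wave.
# The data, window size, and layer sizes are illustrative placeholders.
import numpy as np
import tensorflow as tf

# Synthetic series: a noisy sine wave.
series = np.sin(np.linspace(0, 40 * np.pi, 4000)) + 0.1 * np.random.randn(4000)

# Turn the series into (window of 50 past values) -> (next value) pairs.
window = 50
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., np.newaxis]  # shape: (samples, timesteps, features)

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(window, 1)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)

# Forecast the value following the last observed window.
next_value = model.predict(X[-1:])
print("forecast:", float(next_value[0, 0]))
```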

Well Attended: Sure, AI is a hot area, but this conference was as well organized and attended as it could be; kudos to O'Reilly for doing a fantastic job. Here is a picture from the exhibit area.

The author wrote this in his personal capacity, and the views expressed here are not associated with his employer or any organization he is affiliated with. You may reach the author on Twitter (@hiamitabha).
