Google I/O 2017 Keynote – What marketers think

– VPS | Android Go | TensorFlow & TensorFlow Lite | Google Lens | Google Home | Google Assistant

I have asked a few marketing experts, futurologists and digital gurus for their thoughts on this, but as it's something people want to know about now, I thought I would publish my thoughts and add to the piece later. For now we have Arnout, the awesome Dawn and my own (Gerry) random thoughts on it…

VPS (Visual Positioning System)

GPS has been around for a while now, and frankly I still find it incredible that it works using satellites. VPS uses cameras and known object locations to help you navigate in areas where GPS isn't possible – in store, for example. Combined with Nearby technology, this is how ecommerce and bricks and mortar will merge into one joined-up experience: the data will let superstores and other hard-to-navigate indoor spaces become trackable, interactive experiences. I have said before that local, low-energy Bluetooth and technology such as this are going to become commonplace in places like IKEA, and the best marketers will soon be taking advantage of it to market to users in exceptionally sophisticated ways.

 

Dawn: VPS (visual positioning system) is also a potentially fantastic breakthrough from an accessibility perspective for those who are visually impaired and otherwise unable to easily navigate their way through the aisles of stores and indoor venues. Of course, given that this technology leans on the growing field of machine learning, it's difficult to say how quickly it will become mainstream enough for brands to implement well enough to form part of core consumer experience offerings. Like anything else it will likely be unsophisticated initially and improve over time, but the possibilities here are jaw-dropping game-changers for big retail. A huge bonus: if the likes of IKEA introduce this type of technology, the days of having to follow the yellow-lined road through their indoor warehouse maze may become optional – albeit meatballs won't be included.

 


 

Arnout: This is quite huge, as Dawn says, especially for the visually impaired. But also for retail: couple this with all the data points Google is collecting (and intent data) and we will soon be able to retarget in real life (and in store) ;-). In-store conversions, anyone, or buying reminders? Combine this with data points like product reviews and I can see some really interesting possibilities. I'm excited and a bit scared at the same time.

 

Android Go

Android in particular is rammed with bloatware, much of it running in the background, and some of the worst culprits are Google's own apps. Google recognises that many users will be on low-cost, low-bandwidth devices that still need to perform well, and Android Go targets exactly those users. From a marketer's point of view, particularly where apps are involved, a lighter OS that lets apps like Uber run well will massively benefit many users, democratising access.

 

Dawn: We also know that in some developing regions worldwide the vast majority of internet users are on mobile devices, so anything that lightens the bandwidth load on devices has to be applauded. Furthermore, given the increasing number of popular apps all looking to take a portion of available device storage and data consumption, it's going to be key that apps become more lightweight, so there's actually room for users on limited devices to add new ones without having to make too many choices NOT to install them. There's nothing more frustrating than finding there's simply not enough space left for an app you want to make part of your everyday life, and having to decide whether the new app has enough value to justify deleting something else. Let's hope the large brands developing apps jump on board with the new developer resources available at Google Developers – "Building for Billions" – and make creating lightweight apps part of their ongoing agile practices.

 

Arnout: For me this is the big battle for users in Africa and other less developed countries: by slimming down the Android OS it can run on far cheaper devices, giving Google an entry into those markets. Some of these countries have almost completely skipped fixed internet and went straight to 3G or 4G. This is a huge potential market for us as marketeers, so even more opportunities.

TensorFlow & TensorFlow Lite

 

The future here is unimaginable – seriously, I cannot actually picture what we are going to see in the coming years. We are entering a world where machines evolve by designing their own offspring. All I am going to say on this: please hardcode in the Three Laws of Robotics now!

TensorFlow is Google's machine learning and artificial intelligence playground, and TensorFlow Lite is the lightweight version that runs on your phone. I don't know where this is taking us, but remember that Microsoft released a learning chatbot (Tay) that turned toxic within a couple of days…
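For non-developers wondering what "machine learning framework" actually means here: at its core it is software that fits model parameters to data automatically. The sketch below is a plain-Python stand-in (no TensorFlow install needed) for the kind of gradient-descent loop that TensorFlow builds and accelerates for you at far larger scale; the data and learning rate are invented for illustration.

```python
# Minimal flavour of what frameworks like TensorFlow automate:
# fitting the weight w in y = w * x by gradient descent on squared error.
# Real TensorFlow builds a computation graph and derives gradients for you.

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (x, y) pairs with y = 2x
w = 0.0    # initial guess for the weight
lr = 0.05  # learning rate

for _ in range(200):
    # gradient of mean squared error 0.5 * (w*x - y)^2 with respect to w
    grad = sum((w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad

print(round(w, 3))  # converges close to 2.0
```

TensorFlow Lite's job is then to shrink an already-trained model like this so the predictions (not the training) can run on a low-powered phone.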

 

Dawn: I'm really excited about this open source machine learning resource, particularly the announcement at Google I/O of the new TensorFlow Research Cloud, which gives researchers undertaking computationally intensive machine learning projects free access to a cloud cluster of 1,000 TPUs. We know the power of open source when developers and researchers around the world jump on board. The developments here are going to be quite outstanding given the engagement likely to follow from educational institutions looking to actively build incredible things in areas such as healthcare but currently facing limited resources. The great news is it's not limited to academia: any organisation or researcher with a suitable project can apply to use the Research Cloud.

 

Arnout: I’d have to agree with Dawn on this, and to be honest, for some projects we are already moving this way. There is so much that can be done and I think it is awesome Google open sourced this.

Google Lens

 

Point at a shop and get the website; point at… well, anything, and it learns more about it. You can see the power of this already by putting random keywords into your Google Photos account – my personal demo is "anteater", where it recognised what can only be described as bad photos of an anteater. It recognises many different objects and descriptions. What does it mean for marketers? Structured data about objects will help users move from the offline world to the online world: Lens will recognise an object and lead you to a place to purchase it. My favourite example is making it easier to get onto the wifi by photographing the password (which we all do anyway, don't we?)
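To make that offline-to-online jump work, a recognised object needs to map to machine-readable product data on your site. One established way to expose that today is schema.org markup; the sketch below builds a hypothetical Product record (all names, prices and URLs are invented examples, not anything Google has specified for Lens) as JSON-LD of the kind you would embed in a page.

```python
import json

# Hypothetical schema.org Product markup: the sort of structured data
# that lets a visual search result link an object to a place to buy it.
# Every value here is an invented example.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Armchair",
    "image": "https://example.com/armchair.jpg",
    "offers": {
        "@type": "Offer",
        "price": "199.00",
        "priceCurrency": "GBP",
        "availability": "https://schema.org/InStock",
    },
}

# Embedded in a page inside <script type="application/ld+json">…</script>
print(json.dumps(product, indent=2))
```

The point for marketers is simply that the better your product data is structured, the easier it is for any camera-driven search to route a recognised object to your listing.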

 

Dawn: The days of typing words into a device (mobile or desktop, for that matter) appear to be increasingly numbered. Google's new AI driving Google Lens is yet another example of this. We all know the spoken word in search is targeted by the likes of Google Home, Alexa, Cortana and Siri, but now Google are making inroads into the world of visual search with, amongst other developments, Google Lens. We point our camera at whatever interests us… and Google Lens tells us more. It's a great way for them to understand our preferences without waiting for us to type words into a search box, and to build upon individual user data for potential future targeting and refinement. Don't forget that with all of this there is limited space to display ads on mobile devices, so it all means an increasing necessity to build a one-to-one exchange of value between users and search engines, so that future recommendations are as accurate as possible. You know the things I point my camera lens at; you know me increasingly; you know my likes; you target me with exactly those products, places and product types I want to see in the future. Given our obsession with phone cameras, it was only a matter of time before search engines turned their focus toward ways to take a portion of that consumer attention via interception.

 

Arnout: This visual search power can and will be huge, though I suspect it still converts into a text-based search behind the scenes. It gives us as search marketeers loads of new possibilities, but it also gives Google a way to tie in their shopping products and monetise their image search results. And as a user I can definitely see the huge benefit over a regular search (especially when you are unsure what to call some part or object). Oh, and think of the ways this could fit into 3D printing.

With Google Lens, your smartphone camera won’t just see what you see, but will also understand what you see to help you take action. #io17

 

Google Home

 

This is Google's competitor to Alexa, but it integrates incredibly well with Chromecasts and other home devices – it feels like something Amazon should have done with Fire. As this improves, our interactions will become user-centric rather than device-centric: an interaction can start on a Google Home, move to a TV via Chromecast, and then carry on to your phone or tablet. Together with Google Assistant this will make for a real ecommerce shopping experience, particularly for visual products, and creative marketers will be able to build a very different multi-device experience.

 

Dawn: Hands-free calling direct from Google Home is a huge leap forward announced at Google I/O. You'll now be able to instruct your Home device to call someone merely by speaking to it. This new feature, and some of the Assistant integrations, mean the lag behind Amazon in product-for-product comparison looks to have been virtually eliminated by the developments for Home announced this week at Mountain View. I'm certainly looking forward to using the new recipe assistant and to integrating it increasingly with Google Assistant. Interestingly, I was asked in an interview only a couple of weeks ago whether I thought a statistic someone had thrown around – 70% of search being voice-driven by the end of 2018 – was viable. Whilst this may be overly optimistic, the growing uptake of devices such as Home and their integration with other assistive search products might make this figure far from unrealistic in the not-too-distant future. Certainly brands need to be taking a fresh look at ways to optimise their strategies for imminent dramatic change in the way audiences access search.

 

Arnout: This is a huge step into the living room (the next battlefield, with the likes of Amazon Echo, Microsoft Cortana etc.) – just think of the data that can be had and the attribution models that can be fed by these voice data points. For me it still feels a bit weird, but future generations will indeed not have that problem.

Excited to be a part of #io17. Check out all that’s coming to #GoogleHome: https://t.co/Q6lZJJzOsa

Google Assistant

 

This is the evolution of Google Now – think Siri, but smarter. You can purchase things and book taxis without leaving the app; if you haven't already tried it, you can book and use an Uber without leaving Google Maps, and the end-to-end experience is seamless and actually very simple. Soon you will be able to order pizzas through it, and more.

 

Dawn: It's all very much 'toward an audience of one' from an original 'audience of many', and this is clearly the intent proclaimed on the Google Assistant landing page: intelligent customer relationship management on steroids. I recall Google's Gary Illyes keynoting at Pubcon back in 2015 and introducing the notion of 'assistive search'. The presentation looked at emerging generations' characteristics and the short attention spans created by over-connectedness, multi-device behaviour and our addiction to mobile phones. It included a short clip of 'Her', a film about the relationship between the leading character and his AI personal assistant. At the time it seemed almost surreal, but now I'm intrigued to see whether, as Google Assistant plays an increasing role in our day-to-day lives through more features and integration, consumers begin to depend on this product. The 'Assistant' of the future will likely know us far better than we know ourselves.

 

Arnout: This is enormously important for any search marketeer (or any online marketeer, for that matter). It makes Google guide our decisions by predicting what we will do next, skipping the SERP entirely and simply answering our need, whether informational or transactional – eliminating the middle man. Exciting times ahead.

Dawn Anderson

Digital Marketing & International SEO Consultant – CEO and Founder of Move It Marketing. Dawn is one of those lifelong learners who is passionate about search, marketing, innovation and accessibility. I have had the frequent pleasure of hearing her talk, and based on that I am confident her students (she is a lecturer) are going to be the next generation of cutting-edge digital marketers – you should definitely hear her speak at Search Leeds or at a Northern Take It Offline event this year or next!


Arnout Hellemans

Arnout is a digital marketeer into search (paid as well as organic) living in Amsterdam. He works at OnlineMarkethink as a consultant and at the 2care4kids group as Head of Growth. He loves chatting about the future and has the tinfoil hat on in most conversations. Big into Android, the future of UX and the future of search.

 


  • Peter Mead says:

    Thanks for your perspective on this Dawn. I am particularly interested in open source machine learning since I think it will be important that the smaller players are not powerless as the conglomerates march ahead and dominate further.