As a dedicated Google partner, we always look forward to Google Next, and this year is no exception. We love hearing leaders in the Google Cloud space share their insights into what's new and how it can help CTS solve our customers' challenges, and we can't wait to see the impact these changes and updates will make. Read on to discover what sparked our team's excitement from this year's event…
After a poorly planned (but extremely enjoyable) holiday during Google Next last week, I had a lot to catch up on. The official page lists 123 announcements, and that's without the sub-section for Workspace specifically! There is so much to dig into, making it really hard to pick just one. Amongst the many highlights, we saw plenty of data announcements. Specifically, BigQuery can now process unstructured and streaming data (nice!), such as audio, video and PDFs. I'm a big fan of Looker, which is being neatly folded into the data offering, with Data Studio rebranded as Looker Studio and Looker now working nicely inside Sheets.
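To give a flavour of what that looks like in practice, here's a rough sketch of pointing BigQuery at unstructured files in Cloud Storage via an object table, using the Python client. The project, dataset, connection and bucket names are all placeholders, and it assumes a Cloud Storage connection has already been created in BigQuery.

```python
from google.cloud import bigquery

# Rough sketch: create an "object table" over PDFs sitting in a Cloud Storage
# bucket so they can be referenced from BigQuery SQL. Project, dataset,
# connection and bucket names below are placeholders.
client = bigquery.Client(project="my-project")

ddl = """
CREATE OR REPLACE EXTERNAL TABLE `my_dataset.document_objects`
WITH CONNECTION `us.my-gcs-connection`
OPTIONS (
  object_metadata = 'SIMPLE',
  uris = ['gs://my-unstructured-bucket/pdfs/*']
)
"""
client.query(ddl).result()  # run the DDL and wait for it to finish

# The object table can then be queried like any other table, e.g. to inspect
# the objects' metadata before handing them to an ML model downstream.
rows = client.query(
    "SELECT uri, content_type FROM `my_dataset.document_objects`"
).result()
for row in rows:
    print(row.uri, row.content_type)
```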
And, while we’re here, I’ll add another highlight that’s rather on topic. Whilst I was away, I used Google Lens a lot on my phone to translate menus and the odd sign (classic Brit!). Translation Hub takes that same ML technology and puts an enterprise wrapper around it rather than a consumer one. Really exciting for strengthening global communications for businesses everywhere.
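Translation Hub itself is driven through the console, but the same translation ML is available programmatically via the Cloud Translation API. A minimal sketch from Python (the project ID is a placeholder):

```python
from google.cloud import translate_v3 as translate

# Minimal sketch: translate a couple of strings with the Cloud Translation API,
# the same ML that underpins the consumer experiences and Translation Hub.
# "my-project" is a placeholder project ID.
client = translate.TranslationServiceClient()
parent = "projects/my-project/locations/global"

response = client.translate_text(
    parent=parent,
    contents=["Carte des vins", "Zimmer frei"],
    target_language_code="en",
)

for translation in response.translations:
    print(translation.translated_text)
```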
My highlight was the focus on advancements in artificial intelligence (AI). During the pandemic, 50% of organisations said that AI had accelerated their plans; Frontier Lab, in partnership with NASA, has even created the first 360-degree view of the sun! With Google Cloud being the default choice for data, AI and machine learning, coupled with an open approach that lets you access data from multiple clouds AND new sustainability services (such as Carbon Footprint and the general availability of Google Earth Engine on Google Cloud), now is the perfect time to cook up a new business initiative with CTS and Google Cloud!
The highlight for me was Google’s vision for the Digital Workplace of the future – introducing a new era of communication, expression and co-creation powered by AI advancements. The sheer volume of Workspace announcements is a testament to this vision. Meaningful co-creation was a strong theme – Translation Hub (135 languages in seconds!), Smart Canvas and Immersive Connections, the expansion of Smart Chips, and AppSheet in Meet each stood out, alongside a classic Google demo that showcased the wow factor for co-creation and meeting equity.
Alongside the increasing adoption of open source software, in recent years we’ve seen a huge increase in security incidents exploiting vulnerabilities in software supply chains – i.e. all the processes that go into building and deploying applications on the cloud. Google is an acknowledged leader in the field of cloud native security, so I was delighted to see a number of its offerings brought together in a single, cohesive package with the announcement of Software Delivery Shield. These secure software supply chain (S3C) offerings give developers an opinionated and curated path to production: real-time security feedback as they work in their IDEs (Cloud Workstations, Cloud Code), curated open source artefacts with Assured OSS, non-falsifiable attestations and provenance (SLSA level 3-compliant) for CI with Cloud Build, managed CD for container platforms with Cloud Deploy, and enhanced security posture feedback via dashboards for GKE Standard and Autopilot.
=> TL;DR – developer security from keyboard to runtimes with S3C!!
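Much of that feedback is backed by metadata – vulnerabilities, attestations, provenance – held in Artifact Analysis, which you can also query programmatically. As a rough sketch of pulling vulnerability findings for a built image (the project ID and image URL are placeholders):

```python
from google.cloud.devtools import containeranalysis_v1

# Rough sketch: list the vulnerability findings Artifact Analysis holds for a
# built container image. Project ID and image URL below are placeholders.
def list_image_vulnerabilities(project_id: str, resource_url: str) -> None:
    ca_client = containeranalysis_v1.ContainerAnalysisClient()
    grafeas_client = ca_client.get_grafeas_client()

    occurrences = grafeas_client.list_occurrences(
        request={
            "parent": f"projects/{project_id}",
            "filter": f'kind = "VULNERABILITY" AND resourceUrl = "{resource_url}"',
        }
    )
    for occurrence in occurrences:
        vuln = occurrence.vulnerability
        print(occurrence.note_name, vuln.severity, vuln.short_description)


list_image_vulnerabilities(
    "my-project",
    "https://us-docker.pkg.dev/my-project/my-repo/my-image@sha256:placeholder",
)
```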
Sundar Pichai talked about Project Starline in the Google Next keynote, and I’ve had the opportunity to attend a demo of it in person. I was amazed by its impact. It feels like the other person is in the room with you; the detail of the image is that good. Project Starline is Google’s technology for rendering a 3D image of a person and displaying it to you in the room – without the need for any VR glasses, I may add. It will absolutely change the way we experience virtual meetings. During the demo, the Googler I was speaking to in “real life” held up her hand so I could high-five it, and it felt like I could touch her hand. You can make eye contact with each other, which raises the bar for virtual communication in the future. All of this is made possible by Google’s machine learning, computer vision and spatial audio services – this is what can be achieved when combining Google technologies.
In a previous life, I used to work for a startup building a PostgreSQL plugin, and that wise old veteran of the database world is still close to my heart. That’s why I’m super excited to see the steps Google is taking with AlloyDB. For those who need a top-tier transactional database, or whose analytical journey hasn’t yet led them to BigQuery, AlloyDB lets them take the PG interface they know and move it out to cloud scale. At Next, it was announced that more partners are coming on board with AlloyDB, meaning a wide range of familiar applications will integrate seamlessly with it.
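Because AlloyDB is PostgreSQL-compatible, the familiar tooling really does carry straight over. A minimal sketch using the stock psycopg2 driver – host, credentials and schema are placeholders, and it assumes network access to the AlloyDB instance is already in place:

```python
import psycopg2

# Minimal sketch: AlloyDB speaks the standard PostgreSQL wire protocol, so a
# stock driver like psycopg2 connects exactly as it would to any other Postgres.
# Host, credentials and schema below are placeholders.
conn = psycopg2.connect(
    host="10.0.0.5",   # private IP of the AlloyDB instance (placeholder)
    port=5432,
    dbname="orders",
    user="app_user",
    password="change-me",
)

with conn, conn.cursor() as cur:
    cur.execute(
        "SELECT id, total FROM orders WHERE created_at > now() - interval '1 day'"
    )
    for order_id, total in cur.fetchall():
        print(order_id, total)

conn.close()
```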
For those same PostgreSQL users, Spanner’s PostgreSQL interface can now be used directly from Java or Go. And, as the cherry on the Next cake, Spanner now integrates directly with Vertex AI.
Piecing all this magic together: classic Postgres workloads can now feed directly into machine learning models, with every step of the process performant and scalable. Only Google could pull that off.
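As a rough sketch of the Spanner–Vertex AI side of that, using the Python Spanner client – instance, database, model and table names are placeholders, and the ML.PREDICT query shape is an assumption based on the preview announcement rather than a definitive example:

```python
from google.cloud import spanner

# Rough sketch of the Spanner-Vertex AI integration: once a Vertex AI model has
# been registered against the database, predictions can be requested straight
# from SQL. Instance, database, model and table names are placeholders, and the
# ML.PREDICT query shape is illustrative of the preview announcement.
client = spanner.Client()
database = client.instance("my-instance").database("my-database")

with database.snapshot() as snapshot:
    results = snapshot.execute_sql(
        """
        SELECT review_id, sentiment
        FROM ML.PREDICT(
            MODEL SentimentModel,
            (SELECT review_id, review_text FROM ProductReviews)
        )
        """
    )
    for review_id, sentiment in results:
        print(review_id, sentiment)
```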
Interested in finding out how Google Cloud can transform your business? Get in touch with our team today.