Keeping AI out of the Clouds: An Interview with Edy Liongosari at Accenture

By Carl Ford August 02, 2018

I had the pleasure of connecting with Edy Liongosari, Chief Research Scientist at Accenture, a few days ago to ask some questions about how Accenture manages its technology engagements. Edy is speaking at Fog World Congress, October 1st in San Francisco.

The questions I asked were focused on how Fog Computing was enabling the changing business models associated with Digital Transformation.

Here is what he had to say.

Carl:  You have expertise in looking at outcome-based business models.  How do you see IoT impacting business models?

Edy: IoT has been touted as a key enabler for many non-conventional business models. It gives rise to the intelligent connected products and services we see today. With the real-time insight collected, companies can develop products and services that are highly adaptive and predictive of customers' needs in real time. All of this allows companies to develop a wide range of business models -- from pay-per-use, freemium, and subscription-based to resource sharing -- that were previously too hard or too expensive to implement.

That applies to outcome-based business models as well. They are essentially driven by product-to-service transformation, which focuses on the outcomes the customers aim to achieve. Instead of selling MRI machines, medical equipment companies can sell the health outcome of an MRI scan. Instead of selling cars, automotive companies can sell highly customized and convenient transportation services. This allows companies to think broadly about what they can provide to customers beyond selling a set of products. These services do not need to be confined to those the companies can provide in-house. The model encourages them to build ecosystems of partners to collaboratively deliver larger and higher-value outcomes.

Carl:  The concept of Fog Computing is relatively new.  Is there a first market or application where you see the architecture being deployed?

Edy: While the term Fog Computing is relatively new, its broad concept, along with the network topology it represents, is not.  Fog Computing -- which sits somewhere between Edge and Cloud Computing -- plays a key role in significantly increasing the flexibility, reliability and security of the overall infrastructure for Industrial IoT, in ways that Edge and Cloud computing cannot achieve by themselves.

We see Fog Computing providing the most benefit in complex environments such as oil pipelines in remote locations or offshore oil drilling platforms. In these environments, connectivity to the cloud tends to be unreliable and expensive. The edge devices are often quite simple due to a variety of limitations such as cost and energy consumption. Fog becomes the essential connective tissue between cloud and edge devices.
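The "connective tissue" role Edy describes can be illustrated with a minimal store-and-forward sketch. This is a hypothetical example, not Accenture's implementation: a fog node buffers raw readings from simple edge devices and forwards only aggregated batches to the cloud when the expensive, unreliable uplink is actually available. The `FogNode` class, its batch size, and the averaging step are all illustrative assumptions.

```python
from collections import deque

class FogNode:
    """Hypothetical fog node that buffers edge readings and forwards
    aggregated batches to the cloud only when the uplink is available."""

    def __init__(self, batch_size=3):
        self.batch_size = batch_size
        self.buffer = deque()
        self.cloud_log = []  # stands in for a real cloud endpoint

    def ingest(self, reading, cloud_up):
        # Simple edge devices just emit readings; the fog node decides
        # when and what to forward, shielding them from link outages.
        self.buffer.append(reading)
        if cloud_up and len(self.buffer) >= self.batch_size:
            batch = [self.buffer.popleft() for _ in range(self.batch_size)]
            # Forward one aggregate instead of raw samples to save
            # expensive satellite or cellular bandwidth.
            self.cloud_log.append(sum(batch) / len(batch))

node = FogNode(batch_size=3)
for i, reading in enumerate([10, 12, 14, 16, 18, 20]):
    node.ingest(reading, cloud_up=(i >= 3))  # uplink down for the first readings
print(node.cloud_log)  # two batch averages reach the cloud
```

The point of the sketch is the division of labor: the edge stays dumb and cheap, the cloud sees only condensed data, and the fog layer absorbs the unreliability in between.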

Carl:  Accenture is known for its ability to integrate complex systems.  How does Fog simplify your implementation strategies?

Edy: The Fog Computing Reference Architecture is one of the key parts of Fog Computing, as it provides a common model and guidance for communicating, designing, implementing and managing Fog Computing. It is especially useful for those who are relatively new to this space, helping them quickly reach common ground on a wide range of issues -- from what Fog Computing is and its key functions to the critical aspects to consider in implementation.

The common reference architecture makes it easier to reuse and plug in existing off-the-shelf components -- both hardware and software -- across multiple vendors, and it provides a map of how the various components fit together. This significantly accelerates the implementation phase and yields a more robust solution when done right. It also allows us to easily create a roadmap and priorities for various functions: the essential vs. the optional, which can be implemented later.

Carl: You are speaking at Fog World Congress.  Can you give us a brief description of what people will learn?

Edy: Scalable Intelligence -- how Cloud, Fog and Edge work in concert to provide highly scalable and agile AI.  The ability to deploy machine learning models dynamically across all three layers, and to manage large numbers of complex models, is key. I will describe specific examples to show how this capability was developed and the deployment results.  I will also discuss where the future might lie and what it takes to realize this future.
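Deploying models "dynamically across all three layers" implies a placement policy of some kind. Purely as an illustration of the idea -- the function name, thresholds, and criteria below are invented for this sketch and are not Accenture's actual approach -- a simple policy might weigh model size, latency budget, and uplink availability:

```python
def place_model(model_mb, latency_budget_ms, link_to_cloud_up):
    """Hypothetical placement policy for an ML model across three tiers.

    Tiny models with tight latency budgets run on the edge device;
    mid-size models run on a nearby fog node; the largest models go to
    the cloud when the uplink allows it, otherwise they fall back to fog.
    """
    if model_mb <= 5 and latency_budget_ms <= 20:
        return "edge"
    if model_mb <= 500 or not link_to_cloud_up:
        return "fog"
    return "cloud"

# A small anomaly detector with a hard real-time budget stays on-device:
print(place_model(model_mb=2, latency_budget_ms=10, link_to_cloud_up=True))
# A mid-size vision model lands on the fog node:
print(place_model(model_mb=120, latency_budget_ms=100, link_to_cloud_up=True))
# A large model normally runs in the cloud, but degrades to fog when offline:
print(place_model(model_mb=2000, latency_budget_ms=500, link_to_cloud_up=False))
```

In a real system this decision would be made continuously by an orchestrator as connectivity and load change, which is exactly the management problem Edy flags as key.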

Coming to hear him in person is a smart idea.

Edited by Ken Briodagh

Partner, Crossfire Media
