Keeping AI out of the Clouds: An Interview with Edy Liongosari at Accenture

By Carl Ford August 02, 2018

I had the pleasure of connecting with Edy Liongosari, Chief Research Scientist at Accenture, a few days ago to ask some questions about how Accenture manages its technology engagements. Edy is speaking at Fog World Congress, October 1st in San Francisco.

The questions I asked focused on how Fog Computing is enabling the changing business models associated with Digital Transformation.

Here is what he had to say.

Carl: You have expertise in looking at outcome-based business models. How do you see IoT impacting business models?

Edy: IoT has been touted as a key enabler of many non-conventional business models, giving rise to today's intelligent connected products and services. With real-time insight collected, companies can develop products and services that are highly adaptive and predictive of customers' needs in real time. All of this allows companies to develop a wide range of business models -- from pay-per-use, freemium, and subscription-based to resource sharing -- that were previously too hard or expensive to implement.

That applies to outcome-based business models as well. They are essentially driven by product-to-service transformation, which focuses on the outcomes customers aim to achieve. Instead of selling MRI machines, medical equipment companies can sell the health outcome of an MRI scan. Instead of selling cars, automotive companies can sell highly customized and convenient transportation services. This allows companies to think broadly about what they can provide to customers beyond a set of products. These services need not be confined to what the companies can provide in-house; the model encourages them to build ecosystems of partners to collaboratively deliver larger, higher-value outcomes.

Carl:  The concept of Fog Computing is relatively new.  Is there a first market or application where you see the architecture being deployed?

Edy: While the term Fog Computing is relatively new, the broad concept and the network topology it represents are not. Fog Computing -- which sits somewhere between Edge and Cloud Computing -- plays a key role in significantly increasing the flexibility, reliability and security of the overall infrastructure for Industrial IoT, in ways that Edge and Cloud computing cannot achieve by themselves.

We see Fog Computing providing the most benefit in complex environments such as oil pipelines in remote locations or offshore oil drilling platforms. In these environments, connectivity to the cloud tends to be unreliable and expensive, and the edge devices are often quite simple because of constraints such as cost and energy consumption. Fog becomes the essential connective tissue between the cloud and edge devices.
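The buffering role Edy describes can be sketched in a few lines. This is an illustrative model only -- the `FogNode` class, its methods, and the simulated link are hypothetical names invented here, not an Accenture or OpenFog API: a fog node accepts readings from simple edge devices and forwards them to the cloud only when the unreliable uplink happens to be up.

```python
from collections import deque

class FogNode:
    """Minimal sketch of a fog node: buffers readings from simple edge
    devices and flushes them to the cloud only when the uplink is up."""

    def __init__(self, uplink_available):
        self.buffer = deque()
        self.uplink_available = uplink_available  # callable: is the cloud reachable?
        self.uploaded = []                        # stand-in for the cloud store

    def ingest(self, reading):
        # Edge devices push raw readings; the fog node holds them locally,
        # so a down or expensive link never blocks the devices themselves.
        self.buffer.append(reading)

    def sync(self):
        # Forward the buffered batch only when connectivity allows.
        if not self.uplink_available():
            return 0
        sent = len(self.buffer)
        while self.buffer:
            self.uploaded.append(self.buffer.popleft())
        return sent

# Simulate an intermittent link on an offshore platform: down, then up.
link_states = [False, True]
node = FogNode(uplink_available=lambda: link_states.pop(0))
node.ingest({"sensor": "pressure", "value": 101.3})
node.ingest({"sensor": "flow", "value": 42.0})
print(node.sync())  # link down: 0 readings sent, batch stays buffered
print(node.sync())  # link up: 2 buffered readings flushed to the cloud
```

The design point is simply that the fog layer decouples edge-device timing from cloud availability, which is why it pays off most where the link is flaky.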

Carl: Accenture is known for its ability to integrate complex systems. How does Fog simplify your implementation strategies?

Edy: The Fog Computing Reference Architecture is one of the key parts of Fog Computing, as it provides a common model and guidance for communicating, designing, implementing and managing Fog deployments. It is especially useful for those who are relatively new to the space, helping them quickly reach common ground on a wide range of issues -- from what Fog Computing is and what its key functions are to the critical aspects to consider in implementation.

The common reference architecture makes it easier to reuse and plug in existing off-the-shelf components -- both hardware and software -- across multiple vendors, and it provides a map of how the various components fit together. This significantly accelerates the implementation phase and, done right, yields a more robust solution. It also lets us easily create a roadmap and set priorities across functions: the essential versus the optional that can be implemented later.

Carl: You are speaking at Fog World Congress.  Can you give us a brief description of what people will learn?

Edy: Scalable Intelligence -- how Cloud, Fog and Edge work in concert to provide highly scalable and agile AI. The ability to deploy machine learning models dynamically across all three layers, and to manage large numbers of complex models, is key. I will describe specific examples showing how this capability was developed and the deployment results. I will also discuss where the future might lie and what it takes to realize it.
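One way to picture "deploying models dynamically across all three layers" is a simple placement policy. The sketch below is an assumption of mine, not Accenture's actual system: the function name, capacity figures and latency thresholds are all invented for illustration, with tight-latency small models pushed to the edge, near-real-time models to fog nodes, and everything else to the cloud.

```python
# Hypothetical placement policy: decide which layer runs each ML model
# based on its latency requirement and size. Thresholds are illustrative.
def place_model(latency_ms_required, model_size_mb,
                edge_capacity_mb=10, fog_capacity_mb=500):
    """Return 'edge', 'fog', or 'cloud' for a single model."""
    if latency_ms_required < 10 and model_size_mb <= edge_capacity_mb:
        return "edge"   # hard real-time and small enough: run next to the sensor
    if latency_ms_required < 100 and model_size_mb <= fog_capacity_mb:
        return "fog"    # near-real-time: run on a local fog node
    return "cloud"      # batch or very large: score centrally

# Example fleet of models: (name, latency requirement in ms, size in MB)
models = [
    ("anomaly-detector", 5, 4),
    ("predictive-maintenance", 50, 200),
    ("fleet-optimizer", 5000, 2000),
]
for name, latency_ms, size_mb in models:
    print(name, "->", place_model(latency_ms, size_mb))
```

Running the example places the small anomaly detector at the edge, the mid-sized maintenance model on fog, and the large optimizer in the cloud; a real system would also have to handle model updates and monitoring across layers, which is the management problem Edy highlights.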

Coming to hear him in person is a smart idea.

Edited by Ken Briodagh

Partner, Crossfire Media
