

Keeping AI out of the Clouds: An Interview with Edy Liongosari at Accenture

By Carl Ford August 02, 2018

I had the pleasure of connecting with Edy Liongosari, Chief Research Scientist at Accenture, a few days ago to ask some questions about how Accenture manages its technology engagements. Edy is speaking at Fog World Congress, October 1st in San Francisco.

The questions I asked focused on how Fog Computing is enabling the changing business models associated with Digital Transformation.

Here is what he had to say.

Carl: You have expertise in outcome-based business models. How do you see IoT impacting business models?

Edy: IoT has been touted as a key enabler of many non-conventional business models. It gives rise to the intelligent connected products and services we see today. With real-time insight collected, companies can develop products and services that are highly adaptive and predictive of customers' needs in real time. All of this allows companies to pursue a wide range of business models -- from pay-per-use, freemium, and subscription-based to resource sharing -- that were too difficult or expensive to offer before.

That applies to outcome-based business models as well. The model is essentially driven by product-to-service transformation, which focuses on the outcomes customers aim to achieve. Instead of selling MRI machines, medical equipment companies can sell the health outcome of an MRI scan. Instead of selling cars, automotive companies can sell highly customized and convenient transportation services. This allows companies to think broadly about what they can provide to customers beyond selling a set of products. These services do not need to be confined to what the companies can provide in-house; the model encourages them to build ecosystems of partners to collaboratively deliver larger, higher-value outcomes.

Carl:  The concept of Fog Computing is relatively new.  Is there a first market or application where you see the architecture being deployed?

Edy: While the term Fog Computing is relatively new, the broad concept and the network topology it represents are not. Fog Computing -- which sits between Edge and Cloud Computing -- plays a key role in significantly increasing the flexibility, reliability and security of the overall infrastructure for Industrial IoT, in ways that Edge and Cloud computing cannot achieve by themselves.

We see Fog Computing providing the most benefit in complex environments such as oil pipelines in remote locations or offshore oil drilling platforms. In these environments, connectivity to the cloud tends to be unreliable and expensive, and the edge devices are often quite simple due to limitations such as cost and energy consumption. Fog becomes the essential connective tissue between cloud and edge devices.
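To make that connective-tissue role concrete, here is a minimal Python sketch of the pattern Edy describes: a fog node that aggregates raw readings from simple edge sensors locally and forwards only compact summaries when the expensive cloud uplink happens to be available. The class and method names (FogNode, ingest, sync) are illustrative assumptions, not part of any Accenture or OpenFog implementation.

```python
import json
import time
from collections import deque

class FogNode:
    """Hypothetical fog node: buffers raw edge data, ships summaries to the cloud."""

    def __init__(self, window_size=60):
        self.buffer = deque()       # raw edge readings awaiting aggregation
        self.pending = deque()      # aggregated summaries awaiting cloud sync
        self.window_size = window_size

    def ingest(self, sensor_id, value):
        """Called by nearby edge devices; keeps raw data local to the fog layer."""
        self.buffer.append({"sensor": sensor_id, "value": value, "ts": time.time()})
        if len(self.buffer) >= self.window_size:
            self._aggregate()

    def _aggregate(self):
        """Reduce a window of raw readings to one compact summary for the cloud."""
        values = [r["value"] for r in self.buffer]
        self.pending.append({
            "count": len(values),
            "mean": sum(values) / len(values),
            "max": max(values),
            "ts": time.time(),
        })
        self.buffer.clear()

    def sync(self, cloud_available):
        """Store-and-forward: push summaries only while the costly uplink is up."""
        while cloud_available and self.pending:
            summary = self.pending.popleft()
            print("-> cloud:", json.dumps(summary))  # stand-in for a real upload
```

The point of the sketch is simply that the fog layer absorbs the volume and the unreliability: edge devices stay cheap and dumb, and the cloud sees only what it needs to see, when the link allows it.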

Carl: Accenture is known for its ability to integrate complex systems. How does Fog simplify your implementation strategies?

Edy: The Fog Computing Reference Architecture is one of the key parts of Fog Computing, as it provides a common model and guidance for communicating, designing, implementing and managing Fog Computing. It is especially useful for those who are relatively new to this space, helping them quickly reach common ground on a wide range of issues -- from what Fog Computing is and its key functions, to the critical aspects to consider in implementation.

A common reference architecture makes it easier to reuse and plug in existing off-the-shelf components -- both hardware and software -- across multiple vendors, and it provides a map of how the various components fit together. This significantly accelerates the implementation phase and, when done right, yields a more robust solution. It also allows us to easily create a roadmap and set priorities for the various functions: the essential versus the optional that can be implemented later.
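As a rough illustration of that roadmap idea, the sketch below shows one way such a plan might be written down: a shared map of which capabilities live at which layer, split into essential and optional phases. The capability names and the layer assignments are hypothetical examples, not Accenture's actual reference architecture.

```python
# Hypothetical deployment plan keyed by layer; "essential" capabilities ship
# first, "optional" ones are deferred to later phases.
FOG_DEPLOYMENT_PLAN = {
    "edge":  {"essential": ["sensing", "local control"],
              "optional":  ["on-device inference"]},
    "fog":   {"essential": ["protocol translation", "data aggregation", "security"],
              "optional":  ["local analytics", "model caching"]},
    "cloud": {"essential": ["fleet management", "long-term storage"],
              "optional":  ["model training", "cross-site dashboards"]},
}

def phase_one(plan):
    """Return only the essential capabilities to implement first, per layer."""
    return {layer: caps["essential"] for layer, caps in plan.items()}

print(phase_one(FOG_DEPLOYMENT_PLAN))
```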

Carl: You are speaking at Fog World Congress.  Can you give us a brief description of what people will learn?

Edy: Scalable Intelligence -- how Cloud, Fog and Edge work in concert to provide highly scalable and agile AI. The ability to deploy machine learning models dynamically across all three layers, and to manage large numbers of complex models, is key. I will describe specific examples showing how this capability was developed and the deployment results. I will also discuss where the future might lie and what it takes to realize it.
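For readers wondering what "deploying models dynamically across all three layers" can look like in practice, here is a small Python sketch of one common approach: keep several variants of the same model and pick, per layer, the most accurate variant that fits that layer's resource budget. The model names, memory figures and accuracies are made up for illustration and are not drawn from Accenture's system.

```python
# Hypothetical registry of model variants: name -> (memory needed in MB, accuracy)
MODEL_VARIANTS = {
    "anomaly-detector-quantized": (8, 0.88),
    "anomaly-detector-small":     (64, 0.93),
    "anomaly-detector-full":      (512, 0.97),
}

# Assumed memory budgets for each tier of the deployment.
LAYER_MEMORY_MB = {"edge": 16, "fog": 128, "cloud": 4096}

def select_variant(layer):
    """Return the most accurate variant that fits the layer's memory budget."""
    budget = LAYER_MEMORY_MB[layer]
    candidates = [
        (accuracy, name)
        for name, (memory_mb, accuracy) in MODEL_VARIANTS.items()
        if memory_mb <= budget
    ]
    if not candidates:
        raise ValueError(f"no model variant fits on layer {layer!r}")
    return max(candidates)[1]

if __name__ == "__main__":
    for layer in ("edge", "fog", "cloud"):
        print(layer, "->", select_variant(layer))
```

Under these assumptions the edge gets the quantized model, the fog node gets the small one, and the cloud runs the full model, which is the kind of tiered trade-off Edy's talk is about.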

Coming to hear him in person is a smart idea.




Edited by Ken Briodagh

Carl Ford is a Partner at Crossfire Media.
