Whether you are starting with a green field of devices, networks, and applications or adding connectivity to an existing sensor network, the transmission of data is the rationale for the connectivity. As the number of sensors and devices increases, more data comes in, and it can be augmented with financial, mobile, and social information, both structured and unstructured. The roll-up is called big data. Making that data into something useful for humans is the role of analytics.
The central idea of IoT analytics is that data can be queried to gather context, or it can be monitored as a stream in real time.
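As a minimal sketch of those two modes, consider a small set of hypothetical temperature readings (the data, window size, and threshold here are illustrative assumptions, not tied to any product): a batch query summarizes stored data for context, while a stream monitor reacts to each reading as it arrives.

```python
from collections import deque

# Hypothetical sensor readings: (timestamp, temperature)
readings = [(0, 21.0), (1, 21.4), (2, 22.1), (3, 28.9), (4, 22.3)]

# Batch mode: query stored data after the fact to gather context.
def average_temperature(data):
    return sum(temp for _, temp in data) / len(data)

# Stream mode: monitor readings in real time over a small sliding window.
def monitor(stream, window_size=3, threshold=5.0):
    window = deque(maxlen=window_size)
    alerts = []
    for ts, temp in stream:
        # Flag readings that deviate sharply from recent history.
        if window and abs(temp - sum(window) / len(window)) > threshold:
            alerts.append(ts)
        window.append(temp)
    return alerts

print(average_temperature(readings))  # context from the whole data set
print(monitor(readings))              # real-time alerts: [3]
```

The same data supports both answers; the difference is whether you ask after the fact or watch as it happens.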
It’s no coincidence that in the last few years Internet traffic has tripled, and the number of connected things, beyond smartphones and tablets, is tripling as well. While transmission and connectivity can be expensive, the economics of data storage have changed to the point where continual storage is practical.
Dale Skeen, the CTO and co-founder of Vitria Technology, points out that real-time analysis is the critical aspect of analytics. In his keynote presentation at the M2M Evolution event (now called IoT Evolution) in Miami, he showed that early detection of patterns could significantly reduce disruptions.
Meanwhile, Mobeen Khan, executive director of product marketing management at AT&T, has pointed out that “some data just needs to be read and thrown away. Determining what type of data is important, what should be transmitted immediately, what should be stored and for how long, and what information should be discarded [is important]. Otherwise, you could end up with an almost infinite pile of data to analyze, when only a relatively small portion is of real importance.”
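The triage Khan describes can be sketched as a simple per-reading decision: transmit urgent values immediately, store meaningful changes for later analysis, and discard the rest. The thresholds and categories below are illustrative assumptions, not a description of any carrier's actual policy.

```python
# Decide, per reading, whether data is worth keeping at all.
# alarm_threshold and change_threshold are hypothetical tuning knobs.
def triage(reading, last_stored=None, alarm_threshold=90.0, change_threshold=1.0):
    value = reading["value"]
    if value >= alarm_threshold:
        return "transmit"    # urgent: send upstream right away
    if last_stored is None or abs(value - last_stored) >= change_threshold:
        return "store"       # meaningful change: keep for batch analytics
    return "discard"         # redundant: read it and throw it away

decisions = []
last = None
for r in [{"value": 20.0}, {"value": 20.2}, {"value": 21.5}, {"value": 95.0}]:
    d = triage(r, last_stored=last)
    if d == "store":
        last = r["value"]
    decisions.append(d)

print(decisions)  # ['store', 'discard', 'store', 'transmit']
```

Even this crude filter shows the principle: most readings never need to leave the device, which keeps the "pile of data to analyze" from growing without bound.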
From a consumer’s perspective, we benefit from apps that control thermostats or interact with our car diagnostics. When it comes to analytics, enterprises are the beneficiaries of predictive and prescriptive analysis.
Enterprise architectures are evolving to manage decisions at the appropriate point. While the cloud and analytics as a service are bound to take some of the spotlight, processing at the edge has some very practical advantages. As with storage, processing at the edge is cheap, which has given rise to innovative solutions that are quick to deploy. The result is that analytics can be developed and distributed to the appropriate point.
Analytics has the goal of finding meaningful patterns in the data. From those patterns, analytics provides statistical results that quantify performance, flag anomalies, and forecast operational issues (e.g., maintenance, security). Many of the systems are built with data visualization tools.
As with most of the web, marketing has been the key focus of analytics solutions to date, and as a result many companies are in the business of delivering meaningful pattern detection in consumer buying patterns and loyalty strategies. IoT analytics as a field was part of many a platform company’s solution, including the likes of Axeda and ILS. Other companies, such as Splunk and Vitria, are specifically focused on analytics.
The acquisition by Hitachi of Pentaho for something near $500 million is probably a sign of things to come. These days a good place to watch for new startups is in the open source community. We have lots of activity in the open source community with Apache and Eclipse leading efforts to develop standards. Since much of the effort is on real-time data collection, transport has been part of the focus including protocols such as CoAP and MQTT.
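MQTT, one of the transport protocols mentioned above, routes messages by topic, with the wildcard `+` matching exactly one topic level and `#` matching all remaining levels. A minimal sketch of that matching rule (the topic names are made up; a real deployment would use a library such as Eclipse Paho rather than hand-rolling this):

```python
def topic_matches(filter_str, topic):
    """Return True if an MQTT-style topic filter matches a topic.

    '+' matches exactly one level; '#' (valid only as the last level)
    matches the rest of the topic, including the parent level itself.
    """
    f_parts = filter_str.split("/")
    t_parts = topic.split("/")
    for i, f in enumerate(f_parts):
        if f == "#":
            return True
        if i >= len(t_parts):
            return False
        if f != "+" and f != t_parts[i]:
            return False
    return len(f_parts) == len(t_parts)

print(topic_matches("sensors/+/temperature", "sensors/room1/temperature"))  # True
print(topic_matches("sensors/#", "sensors/room1/humidity"))                 # True
print(topic_matches("sensors/+", "sensors/room1/humidity"))                 # False
```

That single-character distinction between `+` and `#` is what lets a subscriber pick up one device's readings or an entire fleet's without the broker needing a registry of subscribers per device.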
Data visualization is the place where companies try to distinguish themselves from their competitors. However, visualization is a double-edged sword, since it often makes assumptions about units of measure and visual objects based on historical requirements within specific vertical markets. I recommend that readers visit all the members of The Hot List to help them think creatively about their particular markets and where the data can be put to best use. It may also be worthwhile to speak with the web developers and marketing personnel currently using web analytics, since they have been learning how to use additional (and sometimes third-party) big data resources to separate patterns from anomalies.
According to Cisco’s John Chambers, the bulk of analytics technology was designed to deal with data generated within a business’s firewalls and analyzed in the data center. The reality is that the enterprise itself is more mobile; much of the analytics will be derived from device data and social networks outside an organization’s firewalls.
Alex Brisbourne, CEO of KORE Wireless Group, points out that up until now, analytics based on IoT data have been primarily descriptive (answering questions about what has happened), somewhat predictive (describing what’s going to happen), and just a touch prescriptive (recommending what to do about it).
However, with all this thought leadership comes the reality check. According to KPMG, a staggering 96 percent of leading global companies struggle to accurately analyze and interpret their data, and 69 percent consider data and analytics to be crucially or very important to their current growth plans. A further 56 percent say they changed their business strategy to meet the challenges of big data. Other studies have shown that when it comes to IoT, the expectation is that the biggest impact will be on the people and the process, not the data or the things.
It may be that we are just on the verge of understanding the data since Cisco executives have said that by 2020, there will be more than 50 billion connected devices and sensors, generating huge amounts of data, and that over the next 10 years, the Internet of Everything will be worth $19 trillion in new business and cost savings to organizations worldwide.
Of that, $7.3 trillion will come from analytics. A recent Cisco survey found that 40 percent of respondents said their inability to interpret data was the biggest challenge to creating actionable information from the data.
Tom Fountain, the CTO of Pneuron Corp., points out that analytics will impact network architectures. “Without a re-thinking of the processing paradigms of the past, today’s enterprise will be ill-prepared to deal with this change…. An agile, cloud-enabled and intelligent data and analytics fabric is needed for enterprises to successfully address this new normal.”
Splunk’s Brian Gilmore has pointed out that there is machine data that contains a definitive record of all the activity and behavior of your customers, users, transactions, applications, servers, networks, and mobile devices. And it’s more than just logs. It includes configurations, data from APIs, message queues, change events, the output of diagnostic commands, call detail records, and sensor data from industrial systems and more.
Machine data comes in an array of unpredictable formats, and the traditional set of monitoring and analysis tools were not designed for the variety, velocity, volume, or variability of this data. A new approach, one specifically architected for this unique class of data, is required to quickly diagnose service problems, detect sophisticated security threats, understand the health and performance of remote equipment, and demonstrate compliance. Some of this data can be effectively managed and monitored on the streams or at the edge.
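That variety of formats is the practical problem: before any analysis, heterogeneous records have to be coerced into a common shape. A sketch of that normalization step, using three made-up input formats (a JSON API event, a CSV sensor row, and a log-style line) and a hypothetical common record:

```python
import json

def normalize(raw):
    """Coerce a few hypothetical machine-data formats into one record shape."""
    if raw.startswith("{"):                  # JSON event from an API
        event = json.loads(raw)
        return {"source": event["device"],
                "metric": event["metric"],
                "value": float(event["value"])}
    if "," in raw:                           # CSV row: device,metric,value
        device, metric, value = raw.split(",")
        return {"source": device, "metric": metric, "value": float(value)}
    device, rest = raw.split(":", 1)         # log line: "device: metric=value"
    metric, value = rest.strip().split("=")
    return {"source": device, "metric": metric, "value": float(value)}

lines = [
    '{"device": "pump-1", "metric": "rpm", "value": 1450}',
    "pump-2,rpm,1480",
    "pump-3: rpm=1470",
]
records = [normalize(line) for line in lines]
print(records[0])  # {'source': 'pump-1', 'metric': 'rpm', 'value': 1450.0}
```

A production system would handle far messier input than this, but the shape of the work is the same: the analysis layer sees one schema regardless of what each device or application emitted.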
Ultimately, though, companies are looking for operational intelligence to provide a real-time understanding of what’s happening across IT systems and technology infrastructure so people can make informed decisions that add revenue, bring efficiencies, and comply with regulations. The bottom line with analytics is that you are probably going to be constantly finding new insights and opportunities as you explore and expand your data gathering.
Here is The Hot List to help you in those efforts.
To achieve the immense business benefits afforded by the IoT, you need a highly robust and secure network infrastructure. Cisco can help you converge unrelated networks, scale to meet increasing traffic demands, employ advanced data analytics, and inspire a new class of intelligent applications to increase productivity without sacrificing security.
Datawatch provides the only platform for visual analytics to leverage any data at any speed – delivering valuable insights for improving business. The unique ability to acquire, prepare, and transform data from structured and multi-structured sources such as PDF and log files, as well as real-time streaming data, into visually rich analytic applications allows users to dynamically discover key factors that impact any operational aspect of their business. This ability to perform visual discovery against any data at any speed sets Datawatch apart in the big data and visualization markets.
Google’s mission is to organize the world’s information and make it universally accessible and useful.
As much as 90 percent of all data generated by devices such as smartphones, tablets, connected vehicles, and appliances is never analyzed or acted on. Learn how you can derive deep business insight from the Internet of Things – an integrated fabric of devices, data, connections, processes and people – with IBM.
The deviceWISE platform provides seamless and secure integration with the cloud for remote control and monitoring of business operations and equipment by company personnel or authorized third parties via web-based and mobile applications and dashboards. Cloud-to-cloud integration lets companies improve operational efficiencies and create business innovation around collaboration, predictive maintenance, and big data analytics.
Grab exactly the data you need. Event data can be anything – signups, upgrades, impressions, purchases, errors, shares. The arbitrary JSON format we use makes it easy to grab exactly the data you’re looking for, with all of the custom properties you’ve been dreaming of. Event data is big data. That’s why we’ve built a massively scalable, super-resilient event data backend. Send us your worst. We can handle it.
Oracle Data Integrator for big data enables customers to quickly go from data to decisions, streamline their Hadoop development, and enhance data transparency and data governance across the organization. It provides customers with access to an increased number of diverse data types from on-premises and cloud sources, and helps deliver increased performance for growing data volumes, and enrich data quality for business decisions and regulatory compliance.
Azure Stream Analytics is an IoT data stream and event processing engine that provides real-time analytics on large amounts of data coming from things like devices, sensors, infrastructure, and applications and data.
Motomic helps manufacturers leverage the coming Internet of Things to understand how their customers use their products. Once they understand product usage, Motomic helps turn those products into touch points that engage customers.
Delivering the future of business analytics, Pentaho has an open source heritage that drives innovation in a modern, integrated, embeddable platform built for the future of analytics, including diverse and big data requirements. Powerful business analytics are made easy with Pentaho’s cost-effective suite for data access, visualization, integration, analysis and mining.
Whatever you’re doing today, Pivotstream can immediately help you amplify your insights, eliminate wasted effort, make better decisions, and unburden IT. It promises to delight business users, data analysts, and IT leaders at Fortune 100 companies and small work groups every day.
The Pneuron Distributed Platform was architected to combat the traditional challenges that cause business user frustration and extra IT burden. Visual solution configuration maximizes business user accessibility and minimizes repetitious data integration, procurement, and development cycles for IT – while still keeping robust control in the hands of technologists.
PTC enables manufacturers to achieve sustained product and service advantage. The company’s technology solutions help customers transform the way they create and service products across the entire product lifecycle – from conception and design to sourcing and service. Founded in 1985, PTC employs more than 6,000 professionals serving more than 27,000 businesses in rapidly-evolving, globally distributed manufacturing industries worldwide.
You see servers and devices, apps and logs, traffic and clouds. Splunk says it sees data everywhere. Splunk offers the leading platform for operational intelligence. It enables the curious to look closely at what others ignore – machine data – and find what others never see: insights that can help make a company more productive, profitable, competitive and secure.
Tibco is a global leader in infrastructure and business intelligence software. Whether it’s optimizing inventory, cross-selling products, or averting a crisis before it happens, Tibco says it uniquely delivers the Two-Second Advantage – the ability to capture the right information at the right time and act on it preemptively for a competitive advantage. With a broad mix of innovative products and services, Tibco is the strategic technology partner trusted by businesses around the world.
Vitria provides the industry’s leading streaming analytics platform that delivers continuous operational intelligence. Enterprises have deployed Vitria Operational Intelligence to help them uncover, analyze and act on insights from streaming data – in seconds and minutes. With Vitria OI, they can continuously monitor their network and infrastructure, improve their customers’ experience in real-time, engage in more targeted one-to-one marketing to increase customer loyalty and reduce churn, proactively detect and prevent cyber security attacks and fraud, monetize M2M initiatives, and more.
Carl Ford is CEO and community developer of Crossfire Media. (www.xfiremedia.com).
Edited by Ken Briodagh