Think you're ready for Big Data and IoT? Testing times are ahead

By Areg Alimian

Rapid time-to-market is increasingly important in the rollout of new applications and services, or to put it in simpler terms: everyone wants to be first. So new architectures are planned and implemented with virtual environments and hybrid clouds, only to find that customers are complaining about lost VoIP quality and online gamers about long ping times. Waiting for customers and users to complain is one of three basic ways to learn about the performance and resilience of your network, but certainly not the most promising. The second option is waiting for a hacker attack to paralyze your network, and that’s not popular either. The third option is called ‘testing’.

However, not all test methods are suitable for ensuring the availability of services and applications. Trying to validate performance and security without being realistic about application loads and attack techniques quickly leads to a false sense of security. Only tests based on real-world expected load conditions – and beyond what you might expect – will give reliable information about how the network and security infrastructure behaves.
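To make "beyond what you might expect" concrete, here is a minimal sketch of a stepped load ramp that deliberately pushes past the forecast peak and records how the success rate holds up. The endpoint URL and concurrency figures are hypothetical placeholders; a production-grade test would replay the full application mix at far higher rates.

```python
# Minimal sketch: step the offered load past the expected peak and watch
# whether the success rate degrades. TARGET_URL and all figures below are
# hypothetical placeholders for a lab endpoint, not measured values.
import concurrent.futures
import time
import urllib.request

TARGET_URL = "http://test-lab.example.com/health"  # hypothetical lab endpoint
EXPECTED_PEAK = 50             # concurrent requests at the forecast peak
STEPS = [10, 25, 50, 75, 100]  # deliberately ramps to 2x the expected peak

def one_request(url: str) -> bool:
    """Issue a single request; True on HTTP 200, False on any failure."""
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return resp.status == 200
    except Exception:
        return False

for concurrency in STEPS:
    start = time.perf_counter()
    with concurrent.futures.ThreadPoolExecutor(max_workers=concurrency) as pool:
        results = list(pool.map(one_request, [TARGET_URL] * concurrency))
    elapsed = time.perf_counter() - start
    print(f"concurrency={concurrency:4d}  ok={sum(results)}/{concurrency}  "
          f"elapsed={elapsed:.2f}s  beyond_peak={concurrency > EXPECTED_PEAK}")
```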

Start at the beginning
It’s forecast that by 2020 there will be about 50 billion devices connected to the Internet, 10 times more than there are today (1). Many of these devices run complex applications that need to communicate with each other around the clock. This not only automatically generates more data, but also places greater demands on the performance and availability of networks. In particular, HD video and social networking, combined with big data and IoT, have a virtually unlimited hunger for bandwidth.

Attacks are also getting bigger. In a report published in January 2016, the European Agency for Network and Information Security (ENISA) stated that the number of DDoS attacks with bandwidths over 100 Gbps had doubled in 2015 and would continue to increase.

Meeting these growing demands on infrastructure requires a massive data center upgrade, ranging from migrating top-of-rack-to-server connectivity from 10GbE to 25GbE and 50GbE, to enhancing the core network with 100GbE technology. The expected result of this type of upgrade is significantly higher data rates with approximately the same footprint and power consumption, as well as higher server density and a reduced cost per unit of bandwidth. But what guarantees do enterprises have that these expectations will be achieved under real-world conditions?
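One direct way to answer that question is to measure the upgraded links themselves. The sketch below assumes the open-source iperf3 tool is installed, with an iperf3 server running on the far side of a newly migrated link; the server address and the 25 Gbps expectation are hypothetical.

```python
# Sketch: verify that an upgraded link actually delivers the expected rate.
# Assumes iperf3 is installed and a server is listening on the far end.
import json
import subprocess

SERVER = "10.0.0.2"    # hypothetical iperf3 server across the new link
EXPECTED_GBPS = 25.0   # e.g. a freshly migrated 25GbE ToR-to-server link

# -c: client mode, -t: seconds to run, -P: parallel streams, -J: JSON output
out = subprocess.run(
    ["iperf3", "-c", SERVER, "-t", "10", "-P", "4", "-J"],
    capture_output=True, text=True, check=True,
).stdout
gbps = json.loads(out)["end"]["sum_received"]["bits_per_second"] / 1e9
print(f"measured {gbps:.1f} Gbps vs expected {EXPECTED_GBPS} Gbps "
      f"({'OK' if gbps >= 0.9 * EXPECTED_GBPS else 'SHORTFALL'})")
```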

In addition, the unique characteristics of network devices, storage, and security systems, coupled with the virtualization of resources, the integration of cloud computing, and SaaS, can significantly slow the introduction and delivery of new services. Ensuring the throughput needed to deliver new services anytime, anywhere requires infrastructure tests that go above and beyond standard performance tests of individual components.

Customers and internal stakeholders do not care how many packets a web application firewall can inspect per second. They only care about the application response time, which depends on a number of factors. These include the individual systems in the network and their interaction, the application-specific protocols and traffic patterns, the placement of the security architecture, and the time of day. So it’s imperative to test the entire delivery path of an application – end to end – under realistic conditions. This means using a mix of applications and traffic workloads that recreates even the lowest-layer protocols. Simple, standardized tests such as Iometer-style I/O benchmarks are not enough in complex environments.
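As a concrete illustration of measuring what stakeholders actually see, here is a minimal sketch that records end-to-end response-time percentiles for a single application endpoint. The URL and sample count are hypothetical; a realistic test would drive the full application mix concurrently.

```python
# Sketch: measure end-to-end application response-time percentiles --
# what users experience -- rather than per-device packet rates.
# APP_URL is a hypothetical lab endpoint.
import time
import urllib.request

APP_URL = "http://app-under-test.example.com/login"  # hypothetical
SAMPLES = 200

latencies_ms = []
failures = 0
for _ in range(SAMPLES):
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(APP_URL, timeout=10):
            pass
        latencies_ms.append((time.perf_counter() - start) * 1000)
    except Exception:
        failures += 1  # timeouts and errors count against the service level

if latencies_ms:
    latencies_ms.sort()
    p50 = latencies_ms[len(latencies_ms) // 2]
    p95 = latencies_ms[int(len(latencies_ms) * 0.95) - 1]
    print(f"p50={p50:.1f} ms  p95={p95:.1f} ms  failures={failures}/{SAMPLES}")
```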

Testing under real conditions
Enterprise data centers need a test environment that reflects their real load and actual traffic, including all applications and protocols, such as Facebook, Skype, Amazon EC2/S3, SQL, SAP, Oracle, HTTP or IPsec. It’s meaningless, and dangerous, to test a data center infrastructure with 200 Gbps of data when the live network experiences peak loads of over 500 Gbps. Additionally, when testing, consider illegitimate traffic, including increasingly frequent DDoS and synchronized attacks on multithreaded systems. Since attack patterns are constantly changing, timely and continuous tests are crucial. One way to ensure the consistency and timeliness of testing is to leverage an external service that can analyze current attack patterns and update the test environment continuously and automatically.
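To show what such a load profile might look like in practice, the sketch below defines a weighted application mix and samples simulated sessions from it. The application names and percentages are invented for illustration; a real profile would be derived from observed live traffic and would drive protocol-level generators.

```python
# Sketch: a weighted application mix for a load profile. The shares below
# are illustrative assumptions, not measured values.
import random
from collections import Counter

TRAFFIC_MIX = {            # share of generated sessions per application
    "HTTP/web":   0.35,
    "Amazon S3":  0.20,
    "Skype/VoIP": 0.15,
    "SQL":        0.15,
    "IPsec VPN":  0.15,
}

def pick_application() -> str:
    """Draw the next simulated session from the weighted mix."""
    apps, weights = zip(*TRAFFIC_MIX.items())
    return random.choices(apps, weights=weights, k=1)[0]

# Sanity check: the generated distribution should track the target profile.
counts = Counter(pick_application() for _ in range(10_000))
for app, n in counts.most_common():
    print(f"{app:12s} {n / 10_000:6.1%}  (target {TRAFFIC_MIX[app]:.0%})")
```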

Complex storage workloads can only be tested realistically with real traffic. Cache utilization, deduplication, and compression, as well as backup and recovery, must be tested with all protocols used – SMB 2.1/3.0, NFS, CIFS, CDMI or iSCSI – and optionally tuned to ensure compliance with defined service levels.
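As a small example of checking a defined service level, the sketch below probes write latency on a mounted storage target and compares the 95th percentile against a threshold. The mount point and the 10 ms limit are hypothetical; real storage tests would also exercise cache, deduplication, and backup/recovery behavior over the protocols listed above.

```python
# Sketch: probe write latency on a mounted storage target (e.g. an NFS or
# iSCSI mount) against a defined service level. MOUNT_POINT and SLA_P95_MS
# are hypothetical assumptions.
import os
import time

MOUNT_POINT = "/mnt/storage-under-test"  # hypothetical test mount
SLA_P95_MS = 10.0                        # hypothetical service level
BLOCK = os.urandom(64 * 1024)            # 64 KiB write blocks

latencies_ms = []
path = os.path.join(MOUNT_POINT, "sla_probe.bin")
with open(path, "wb") as f:
    for _ in range(500):
        start = time.perf_counter()
        f.write(BLOCK)
        f.flush()
        os.fsync(f.fileno())             # force the write down to the device
        latencies_ms.append((time.perf_counter() - start) * 1000)
os.remove(path)

latencies_ms.sort()
p95 = latencies_ms[int(len(latencies_ms) * 0.95) - 1]
verdict = "within" if p95 <= SLA_P95_MS else "VIOLATES"
print(f"p95 write latency: {p95:.2f} ms  {verdict} SLA ({SLA_P95_MS} ms)")
```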

While the need for stringent testing is obvious for a new data center, it’s equally important when consolidating or integrating hybrid clouds.  This is because each new application, and even updates and patches of existing applications, can significantly alter the performance and response times of the network.
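One lightweight way to catch such regressions is to keep a performance baseline and compare each fresh measurement against it after every update or patch. In this sketch the baseline file name and the 20 percent tolerance are arbitrary assumptions.

```python
# Sketch: flag a performance regression after an application update by
# comparing a new p95 latency against a stored baseline. The file name
# and 20% tolerance are hypothetical choices.
import json
import pathlib

BASELINE_FILE = pathlib.Path("latency_baseline.json")  # hypothetical
TOLERANCE = 1.20   # fail if p95 latency grows more than 20% over baseline

def check_regression(current_p95_ms: float) -> bool:
    """Record a baseline on first run; afterwards, compare against it."""
    if not BASELINE_FILE.exists():
        BASELINE_FILE.write_text(json.dumps({"p95_ms": current_p95_ms}))
        print(f"baseline recorded: {current_p95_ms:.1f} ms")
        return True
    baseline = json.loads(BASELINE_FILE.read_text())["p95_ms"]
    ok = current_p95_ms <= baseline * TOLERANCE
    print(f"baseline={baseline:.1f} ms  current={current_p95_ms:.1f} ms  "
          f"{'OK' if ok else 'REGRESSION'}")
    return ok

# Example: feed in the p95 from the end-to-end measurement above.
check_regression(42.0)
```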

DIY or TaaS?
In addition to developing and testing a network infrastructure, it’s equally important to develop a qualified test team. Enterprises do not typically hire dedicated test engineers, and network and security architects are not always proficient in designing and executing comprehensive tests to ensure their applications and IT systems can handle heavy loads and sophisticated attacks.

As such, external TaaS (Testing as a Service) offerings can be a useful addition to an in-house solution, especially for larger projects. An external service provider can help determine which systems are the best fit within an existing environment, or before the rollout of a demanding new application such as online gaming.

So the choices are simple: wait for customer complaints to learn about the performance and resilience of your network; wait for a hacker attack to paralyze your network; or put your network and applications to the “real” test with solutions and offerings that replicate your specific load requirements. No-brainer.

About the Author: Areg Alimian is senior director of solutions marketing at Ixia. Areg is an entrepreneurial executive with over 18 years of industry experience in leadership roles spanning marketing, product development and business leadership. He is recognized for his ability to conceptualize, develop and grow breakthrough products in new markets and domains including information security, streaming and social media, network test, Wi-Fi, and network performance management.

Edited by Ken Briodagh