How to Gain IT Efficiencies With Analytics

As mid-sized companies venture into analytics, one of the best places to start in terms of finding efficiency is right in the data center.

Mary E. Shacklett, President of Transworld Data

March 21, 2016


There have always been tools in the market to help IT achieve operational efficiencies. These tools have ranged from capacity management software to menu-driven software and network configurators, from operational presets in network gear to fourth generation report generators that abstracted report developers from the technicalities of database access and native code development.

But in the age of analytics, new capabilities are coming online that can further assist IT with operational efficiencies. Among them are analytics tools that can tackle the challenges of efficiently managing data center facilities, solutions that can address the now overwhelming task of data management, and software that can wrestle with security concerns originating far beyond the walls (and firewalls) of a company. Many large enterprises are moving forward with these new analytics solutions, but a great number of small and mid-market companies are held back by resource and staff-knowledge constraints.

If you are a mid-market company and you want to improve your IT operational efficiencies by taking advantage of new analytics solutions, what are the areas you should be targeting?


Data management

There are many facets to data management, but the areas companies are most concerned with today are: accuracy of data; techniques for managing non-traditional data, such as machine and Internet of Things (IoT) data; optimal access to the data the company continuously needs; and the ability to store the burgeoning volumes of data that continuously stream in.

One big data challenge is that big data is messy. It comes in from many different sources, and cleaning up these different data types can be onerous and time consuming. It also cuts deeply into project timelines, to the point where it can begin to erode management's enthusiasm for the projects. In the future, solutions are likely to appear in the marketplace that will be easy enough for business end users to use in a self-service mode from the cloud, and that will enable them to correct data inaccuracies based upon what they already know from their business experience. This technology could cut the amount of ETL (extract, transform, and load) work that IT must perform in the preparatory stages of big data projects.

Just as important is a new spate of cloud-based machine data and IoT applications that can clean and transform this data into actionable information at a minimum of IT effort. Automated software algorithms operate at sites where machines are located to perform the ETL, only transmitting the finished data product over the network to a central data repository. This lightens the load on bandwidth and the network when data is transferred. “Analysis of machine data at the edge of the network and the ability to clean and transform this data before it travels over the network eliminate two of the biggest pain points in the IoT industry,” said Puneet Pandit, Co-founder and CEO of Glassbeam, a provider of the service.
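The edge-ETL pattern described above can be sketched in a few lines. This is a minimal illustration, not Glassbeam's actual product: the field names, cleaning rules, and summary schema are all invented for the example. The point is simply that raw readings are cleaned and reduced at the site, so only a small finished record crosses the network.

```python
# Minimal sketch of edge-side ETL for machine data: clean and reduce
# readings locally, then ship only the finished records to a central
# repository. All field names and rules are illustrative assumptions.

def clean(readings):
    """Drop malformed or out-of-range readings before they leave the site."""
    return [r for r in readings
            if r.get("temp_c") is not None and -40 <= r["temp_c"] <= 150]

def transform(readings):
    """Reduce raw readings to the summary the central repository needs."""
    temps = [r["temp_c"] for r in readings]
    return {
        "machine_id": readings[0]["machine_id"],
        "samples": len(temps),
        "avg_temp_c": round(sum(temps) / len(temps), 2),
        "max_temp_c": max(temps),
    }

raw = [
    {"machine_id": "press-07", "temp_c": 71.2},
    {"machine_id": "press-07", "temp_c": None},   # sensor glitch
    {"machine_id": "press-07", "temp_c": 74.8},
]

# Only this small summary crosses the network, not every raw reading.
payload = transform(clean(raw))
print(payload)
```

Three raw readings become one summary record here; at IoT scale, the same reduction applied to millions of readings is what relieves the bandwidth pain point the article describes.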

Finally, almost every major storage vendor now offers automated data storage software based on a ruleset that IT establishes. Analytics then apply this ruleset to store frequently accessed data either in-memory or on solid state storage, less frequently used data on hard drives, and seldom-used data that must be archived on very slow (but highly economical) hard drives and tape.
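The tiering logic behind such products can be sketched as a simple ruleset that maps observed access frequency to a storage tier. The tier names and thresholds below are assumptions for illustration, not any vendor's actual policy language.

```python
# Illustrative sketch of an IT-defined storage ruleset: analytics on
# access counts decide the tier each dataset lands on. Thresholds are
# in accesses per day and are invented for this example.

RULESET = [
    (1000, "in-memory"),     # hottest data
    (100,  "ssd"),
    (10,   "hdd"),
    (0,    "archive-tape"),  # seldom used, cheapest media
]

def choose_tier(accesses_per_day):
    """Return the first tier whose threshold the access rate meets."""
    for threshold, tier in RULESET:
        if accesses_per_day >= threshold:
            return tier
    return "archive-tape"

for name, rate in [("orders", 5000), ("session-logs", 150), ("logs-2015", 3)]:
    print(name, "->", choose_tier(rate))
```

In a real product the access counts would come from continuous usage analytics rather than a static number, but the decision structure is the same: IT writes the rules once, and the software places the data.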

With the exception of the automated data storage analytics, the best news about these data management solutions for small and mid-market companies is that they either are, or will be, available on a subscription basis from the cloud, eliminating the need to invest in more data center hardware and software.


Security

There are great and affordable security tools that have already proven themselves and that companies of all sizes use. A lingering concern, however, is finding a way to unify all of these tool silos into a single information source where a holistic view of company security can be achieved. Until recently, IT had to go from tool to tool and then manually piece together an overarching report that covered all security aspects. Now, marketplace solutions provide a software layer that enables all of these security tools to plug in. This eliminates time-consuming IT reviews of the security logs for each individual tool.
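The core of such a unifying layer is normalization: each tool's native event format is mapped onto one common schema, and events are merged into a single timeline. The sketch below illustrates the idea only; the tool names, fields, and schema are invented, not any product's API.

```python
# Hedged sketch of the "single pane of glass" idea: normalize events
# from separate security tools into one schema and one timeline, so IT
# reviews one report instead of each tool's log. All names are invented.

def normalize(tool, event):
    """Map each tool's native event format onto a common schema."""
    if tool == "firewall":
        return {"ts": event["time"], "source": tool,
                "severity": event["level"], "detail": event["msg"]}
    if tool == "antivirus":
        return {"ts": event["detected_at"], "source": tool,
                "severity": event["risk"], "detail": event["threat"]}

feeds = {
    "firewall":  [{"time": 1, "level": "high", "msg": "port scan"}],
    "antivirus": [{"detected_at": 2, "risk": "low", "threat": "adware"}],
}

# Merge every tool's events into one chronological report.
unified = sorted(
    (normalize(tool, e) for tool, events in feeds.items() for e in events),
    key=lambda ev: ev["ts"],
)
for ev in unified:
    print(ev["ts"], ev["source"], ev["severity"], ev["detail"])
```

Adding a new tool means writing one more mapping in `normalize`, rather than teaching staff yet another console; that is where the review-time savings come from.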

Other security tools address the concerns that IT professionals have about public clouds. “The key is to give IT security professionals the same level of security visibility and ability to mitigate in the public cloud as they have on premises,” said George Gerchow, director of product management for security and compliance at Sumo Logic, which provides a solution. While tools like this open up security visibility and mitigation in a public cloud, they don’t necessarily reduce the IT workload. Yet they do help IT security professionals sleep at night.


Facility management

A manager at a major West Coast utility once told me that he could immediately tell when a new data center was brought online, because of the immediate spikes in energy usage. Today, sensors can be attached to HVAC units, IT hardware, and other equipment, and then linked into analytics applications that monitor usage and issue alerts whenever certain temperature or usage thresholds are exceeded. This simplifies the coordination work that should be taking place between IT and facilities, but that doesn’t always happen. It also protects data center assets and reduces energy costs.
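The threshold-and-alert logic at the heart of such monitoring applications is straightforward to sketch. The device names, metrics, and limits below are illustrative assumptions; a real deployment would take its thresholds from facilities standards and its readings from live sensors.

```python
# Simple sketch of threshold-based facility monitoring: each sensor
# reading is checked against limits, and an alert is raised for every
# metric over its threshold. Thresholds here are invented examples.

THRESHOLDS = {"temp_c": 27.0, "humidity_pct": 60.0}

def check(reading):
    """Return an alert string for each metric that exceeds its limit."""
    return [
        f"ALERT {reading['device']}: {metric}={value} exceeds {limit}"
        for metric, limit in THRESHOLDS.items()
        if (value := reading.get(metric)) is not None and value > limit
    ]

readings = [
    {"device": "hvac-1",  "temp_c": 24.5, "humidity_pct": 48.0},
    {"device": "rack-12", "temp_c": 31.2, "humidity_pct": 55.0},
]

for r in readings:
    for alert in check(r):
        print(alert)
```

Because both IT and facilities staff can read the same alerts, the coordination the article mentions happens through the tool rather than through meetings.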

So what are the takeaways for small and mid-market companies with lean IT budgets?

Start small. In the area of facility environmental control, many sites start with thermostats and humidity monitors equipped with data-sending sensors.

If you are monitoring machine-based data emanating from many remote sites, collecting and cleaning the data locally and transmitting only the reduced, essential elements to a central data repository will save bandwidth and money.

Consider public clouds as hosts for some of your applications and analytics, but only after you ensure that the security professionals in your organization have the same visibility and ability to mitigate security breaches in a public cloud as they can within your own data center.

About the Author(s)

Mary E. Shacklett

President of Transworld Data

Mary E. Shacklett is an internationally recognized technology commentator and President of Transworld Data, a marketing and technology services firm. Prior to founding her own company, she was Vice President of Product Research and Software Development for Summit Information Systems, a computer software company; and Vice President of Strategic Planning and Technology at FSI International, a multinational manufacturer in the semiconductor industry.

Mary has business experience in Europe, Japan, and the Pacific Rim. She has a BS degree from the University of Wisconsin and an MA from the University of Southern California, where she taught for several years. She is listed in Who's Who Worldwide and in Who's Who in the Computer Industry.
