Customers need and expect access to current information. In the world of transit, this means access to updated schedules, notifications of delays or changes, best routes to use and transfers to other systems. And, in a Web 2.0-enabled world, customers expect this information to be available at their fingertips, whenever and wherever they need it, no matter the type of Internet connection or end device they use. Emails or text messages explaining detours and delays, “tweets” telling riders what time the next bus will arrive at their stop, Web access to the latest schedules and information, YouTube videos explaining safety and service features, video feeds showing on-site conditions … Transit must work to incorporate emerging Web technologies into daily operations in order to fit seamlessly into the daily lives of our customers.
The challenge, of course, is applying emerging technologies to information delivery while using legacy infrastructures. According to some industry estimates, many agencies spend more than 70 percent of precious IT resources on these aging systems. Further compounding the problem is a lack of training resources to help employees learn the new technologies.
As emerging Web technologies grow into powerful operations and customer service tools, virtualization is gaining favor as the key to managing and optimizing the datacenter. Virtualization can deliver applications to end users and ultimately improve the experience of employees and customers alike, all without a complete overhaul of the legacy infrastructure. It also improves secure access to networks and data, and can help a transit organization “go green” in new and more pronounced ways.
Optimizing the datacenter
The datacenter is the heart and soul of transportation technology, which depends on it to provide quality service to the customer. The ability to leverage emerging Web technologies to deliver better, faster, safer data and applications to both employees and customers ultimately depends on the datacenter's ability to support those technologies. However, the patchwork solution of simply adding more physical servers to increase capacity and accommodate emerging technologies creates server sprawl: datacenters overloaded with servers that most likely run nowhere near full capacity, yet place a huge strain on personnel as well as financial and environmental resources. I am aware of one server set up for a specific application that used only 7 percent of the server's capacity. Although probably a good decision at the time, in today's virtual world it amounts to unnecessary waste.
Fortunately, worldwide averages fare a little better, with about 40 percent of server capacity actually in use. But this still represents a huge investment in a greatly underutilized infrastructure.
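The consolidation arithmetic behind those percentages is straightforward. As a back-of-the-envelope sketch (the function name, the 80 percent planning target, and the server counts are illustrative assumptions, not figures from any agency), one could estimate how many virtualization hosts are needed to carry the same total workload:

```python
import math

def hosts_after_consolidation(num_servers, avg_utilization, target_utilization=0.8):
    """Estimate the physical hosts needed after consolidating workloads.

    num_servers        -- physical servers in the datacenter today
    avg_utilization    -- average fraction of capacity actually in use (e.g. 0.40)
    target_utilization -- assumed planning target for virtualization hosts
    """
    # Total work being done, expressed in "whole servers" of capacity.
    total_load = num_servers * avg_utilization
    # Hosts needed to carry that load at the target utilization (round up).
    return max(1, math.ceil(total_load / target_utilization))

# 100 servers at the worldwide 40 percent average, consolidated onto
# hosts planned for 80 percent utilization: 100 * 0.40 / 0.80 = 50 hosts.
print(hosts_after_consolidation(100, 0.40))  # → 50

# The 7-percent server mentioned above would not even fill one host.
print(hosts_after_consolidation(1, 0.07))    # → 1
```

Even this rough model, which ignores peak loads, failover headroom, and memory constraints, suggests why consolidation ratios of 2:1 or better are plausible when average utilization sits well below 50 percent.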
The power and cooling requirements of the traditional datacenter alone make it a significant drain on the bottom line. And because different servers often run different versions of the same applications, we create an unstable environment that demands additional time and resources just to keep everything up and running, which only compounds the poor server performance. All of this adds up to a datacenter that inhibits, rather than facilitates, improved service to the business and the customer.
Virtualization technologies can unleash the power of the datacenter, optimizing its performance while improving agility and minimizing maintenance hours and costs. Data remains protected in the datacenter, safely behind the firewall, while images of the data are transmitted to end users over the network. End users experience the data and applications as if they were accessing them directly, but in reality only mouse clicks and pixels actually travel on the network. All computing is executed on the server, in the datacenter, a more secure way to handle sensitive data that also improves the ability to push information out to customers and, ultimately, to serve them better.