Data Center is our focus

We help you build, access, and manage your data center and server rooms

Structured Cabling

We structure your cabling: fiber optic, UTP, STP, and electrical.

Get ready for the #Cloud

Start your Hyper Converged Infrastructure.

Monitor your infrastructure

Monitor your hardware, software, and network (ITOM), and maintain your ITSM services.

Our Great People

Great team to support happy customers.

Saturday, January 12, 2013


10 things you should know about The Internet of Things

By Patrick Gray | January 10, 2013, 7:14 AM PST
The Internet of Things (IoT) has gone from university classrooms and near-science fiction to a common topic in boardrooms and product planning sessions. So what are the 10 key things you need to know about the Internet of Things?

1: What is a “Thing”?

The “thing” commonly referred to by the concept of the Internet of Things is any item that can contain an embedded, connected computing device. A “thing” in the IoT could be a shipping container with an RFID tag or a consumer’s watch with a WiFi chip that sends fitness data or short messages to a server somewhere on the Internet.

2: Why now?

If you’ve been around technology for a while, you’ll know that what is billed as new and innovative often smacks of a past technology that received similar billing. IoT is no exception. You may recall breathless accounts of the coming world of interconnected devices a decade ago, with innovations like JavaBeans. The primary difference between now and then is the nearly ubiquitous presence of mobile data networks, combined with low-cost and highly capable devices. A decade ago, ubiquitous smartphones and devices like the Arduino would have been unimaginable. Today, they’re commonplace and cost less than dinner for two.

3: All about the data

Just as most social media sites are in the advertising business rather than some selfless notion of human interconnectedness, the excitement around IoT for most businesses is around the data that the IoT can generate. In some of the more traditional cases, like a supply chain filled with IoT devices, data about every product moving through the chain has obvious benefits. But everyone from technologists to marketers imagines far more interesting use of the data generated by an Internet of Things — such as highly detailed, location-specific marketing and consumer products that can “smell” their environment and react accordingly.
So what are some of the common uses of the IoT that are relevant today?

4: Leveling the playing field

A frequently cited use for the IoT is what amounts to telematics data: information about a device’s location and status. While this is nothing new, the fact that what amounts to a state-of-the-art telematics device is now in most people’s pocket in the guise of a smartphone makes several business models attainable to much smaller entities. Recent examples include things like small companies upsetting big city taxi companies by using mobile devices to create ad hoc, unlicensed “taxicab” networks. Cheap, connected hardware is even allowing for industrial applications that were once the province of companies that could afford expensive, custom hardware.

5: More than marketing

While marketing mavens are salivating over the possibilities of gathering detailed demographic and location data from the Internet of Things, I have yet to meet a consumer who is chomping at the bit for more advertisements directed their way — especially ads based on the intimate details of their interactions with products and movements around the world. The companies that will meet with the most success in IoT need to offer more than just a “big brother”-style advertising experience. Perhaps a fitness watch might suggest a trip to the local salad bar when you miss the morning workout, or your car might schedule your next oil change based on your location and driving habits. These are true value-added services, not just a blanket of advertising.

6: Self-repairing devices

As devices grow increasingly complex, an ability to proactively diagnose, repair, and provide usage information to manufacturers becomes a competitive differentiator. We’ve already seen the early stages of this innovation, as everything from our phones to our televisions now connect to a network and routinely demand software updates. At the lower end of the spectrum, the costs of these technologies have fallen dramatically, allowing even traditional products to connect to a network and send diagnostic data.
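As a sketch of the kind of logic involved, here is a minimal, hypothetical version of the update-and-diagnostics check such a device might run. The version numbers, device IDs, and payload fields are invented for illustration, not taken from any real product.

```python
# Illustrative sketch: a connected device reports diagnostics and
# decides whether it needs a firmware update. All names are hypothetical.

LATEST_FIRMWARE = (2, 1, 0)  # version the manufacturer currently ships

def needs_update(installed):
    """Compare a (major, minor, patch) tuple against the latest release."""
    return installed < LATEST_FIRMWARE

def diagnostic_report(device_id, installed, error_count):
    """Build the payload a device might upload to its manufacturer."""
    return {
        "device": device_id,
        "firmware": ".".join(map(str, installed)),
        "errors": error_count,
        "update_required": needs_update(installed),
    }

report = diagnostic_report("tv-0042", (2, 0, 3), error_count=1)
```

A real device would transmit this payload over the network and fetch the update if required; the decision logic, though, is often no more complicated than this version comparison.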

7: Sociology 2.0

Among the less commercial applications of IoT are the opportunities it presents to deepen our understanding of humanity itself. Whether it’s somewhat mundane areas, like tracking critical medicines or food supplies, or more nuanced experiments that might track how an idea or trend spreads among different communities, the concept of having “smart,” traceable devices that can discern and report how humans interact with them presents an amazing amount of potential.
While these are just a few of the potential applications of IoT, the technology is not without its caveats. Here are a couple.

8: The delicate task of unleashing smart “things”

A recent newspaper article raised questions around the legality of common fitness watches uploading heart rate data to fitness portals. With the prices of these devices at commodity levels, even lackluster athletes like myself can record heart rate data during a workout, upload it to a fitness portal, and glean training suggestions and information on how our fitness has improved. The article mentioned that government regulators were treating this as medical data and questioning whether it should be subject to the same regulations as traditional health records. Imagine being a smaller company that releases a hit fitness product, only to find that the government now wants to treat you as a medical device maker. Ouch.

9: The devil is in the details

IoT devices combine a multitude of disciplines that are different from conventional products. While your company may already have competencies in IT and technology management, are you ready to embed deep IT capabilities into every product? Can your IT organization that’s used to supporting internal email users handle thousands of calls when you botch a firmware update and effectively kill your product? Is your legal team ready to defend against class-action lawsuits, consumer backlash, or government regulators? Just because you can drop a connected chip into your product doesn’t necessarily mean you should.

10: So are we really there yet?

As mentioned, almost since the dawn of microelectronics there has been a notion of interconnected, smart devices. The technology and networks are ready now, but there are still quite a few question marks, including political and societal readiness to allow our devices to increasingly report on our activity and whether companies can capture and interpret the massive amounts of data that IoT will generate.

More resources

For a comprehensive look at the issues and technologies surrounding the Internet of Things and the emerging Machine-to-Machine (M2M) ecosystem, check out ZDNet’s latest feature page, Tapping M2M: The Internet of Things.


M2M and the Internet of Things: A guide

Summary: The Internet of Things will consist primarily of machines talking to one another, with computer-connected humans observing, analysing and acting upon the resulting 'big data' explosion. Here's how the next internet revolution is shaping up.
In its initial phase, all of the internet's IP addresses were assigned to computers of one sort or another. Some of these were servers, and a growing number were clients that mostly consumed (but could sometimes modify) content on those servers.
As the internet — and in due course the worldwide web — developed, more kinds of (increasingly mobile) computing devices became connected, and web servers delivered ever richer content with which they could interact. Although this first internet/web revolution changed the world profoundly, the next disruptive development, in which the majority of internet traffic will be generated by 'things' rather than by human-operated computers, has the potential to change it even more.
This 'Internet of Things' (IoT), or more prosaically 'Machine to Machine' (M2M) communication, is well under way — after all, microprocessors are to be found in all manner of 'things': domestic white goods, cars, credit cards, your passport, your family pet, the CCTV camera in your street, the lift (elevator) in your office and many more. Add the magic ingredient of internet connectivity (or the ability to be read by an internet-connected device), bake with applications and services that make use of the data gathered by this vastly expanded network, and you've cooked up another technology revolution.
As the authors of the excellent Trillions: Thriving In The Emerging Information Ecology put it: "The data are no longer in the computers. We have come to see that the computers are in the data".
However, as the aforementioned book discusses at length, there's many a slip between a potential brave new technological world and a reality that could improve the quality of life of a significant proportion of humankind. Whether the Internet of Things comes to pass in a satisfying way will depend critically on how the emerging M2M ecosystem is architected.

The anatomy of M2M

Any new field comes with its own concepts and jargon, so it's useful to map these out as clearly as possible: our taxonomy is outlined below.
A point worth stressing is that data transfer patterns in the M2M-driven Internet of Things will differ fundamentally from those in the classic 'human-to-human' (H2H) internet. M2M communications will feature orders of magnitude more nodes than H2H, most of which will create low-bandwidth, upload-biased traffic. Many M2M applications will need to deliver and process information in real time, or near-real-time, and many nodes will have to be extremely low-power or self-powered (e.g. solar-powered) devices.
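To make that traffic shape concrete, here is a back-of-envelope calculation under assumed, purely illustrative figures: a million nodes, each uploading one 200-byte reading per minute.

```python
# Back-of-envelope sketch of M2M traffic shape: many nodes, tiny
# upload-biased payloads. All figures are illustrative assumptions.

nodes = 1_000_000          # sensor nodes in a hypothetical deployment
payload_bytes = 200        # one reading, including protocol headers
readings_per_hour = 60     # one reading a minute

# Aggregate upload volume per hour, in bytes.
upload_per_hour = nodes * payload_bytes * readings_per_hour

per_node_kb = payload_bytes * readings_per_hour / 1024  # ~12 KB/h per node
total_gb = upload_per_hour / 1024**3                    # ~11 GB/h in aggregate
```

Each node alone is trivial (roughly 12 KB an hour), yet the fleet produces on the order of 11 GB an hour: exactly the "orders of magnitude more nodes, low bandwidth each" pattern described above.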
Things
The 'things' in the IoT, or the 'machines' in M2M, are physical entities whose identity and state (or the state of whose surroundings) can be relayed to an internet-connected IT infrastructure. Almost anything to which you can attach a sensor — a cow in a field, a container on a cargo vessel, the air-conditioning unit in your office, a lamppost in the street — can become a node in the Internet of Things.
Sensors
These are the components of 'things' that gather and/or disseminate data — be it on location, altitude, velocity, temperature, illumination, motion, power, humidity, blood sugar, air quality, soil moisture... you name it. These devices are rarely 'computers' as we generally understand them, although they may contain many or all of the same elements (processor, memory, storage, inputs and outputs, OS, software). The key point is that they are increasingly cheap, plentiful and can communicate, either directly with the internet or with internet-connected devices.
Comms (local-area)
All IoT sensors require some means of relaying data to the outside world. There's a plethora of short-range, or local-area, wireless technologies available, including RFID, NFC, Wi-Fi, Bluetooth (including Bluetooth Low Energy), XBee, ZigBee, Z-Wave and Wireless M-Bus. There's no shortage of wired links either, including Ethernet, HomePlug, HomePNA, HomeGrid and LonWorks.
Libelium's customisable Waspmote sensor/comms board (left) and the Waspmote Plug & Sense enclosure (right), with connections for sensors, antennas, a solar panel and USB PC connectivity.
Comms (wide-area)
For long-range, or wide-area, links there are existing mobile networks (using GSM, GPRS, 3G, LTE or WiMAX, for example) and satellite connections. New wireless networks such as the ultra-narrowband SIGFOX and the TV white-space NeulNET are also emerging to cater specifically for M2M connectivity. Fixed 'things' in convenient locations could use wired Ethernet or phone lines for wide-area connections.
Some modular sensor platforms, such as Libelium's Waspmote (left), can be configured with multiple local- and wide-area connectivity options (ZigBee, Wi-Fi, Bluetooth, GSM/GPRS, RFID/NFC, GPS, Ethernet). Along with the ability to connect many different kinds of sensors, this allows devices to be configured for a range of vertical markets.
Server (on premises)
Some types of M2M installation, such as a smart home or office, will use a local server to collect and analyse data — both in real time and episodically — from assets on the local area network. These on-premise servers or simpler gateways (right) will usually also connect to cloud-based storage and services.
Libelium's Meshlium gateway, which includes local storage, and a diagram of a ZigBee sensor network.
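The gateway pattern described above (buffer readings from local assets, then forward them upstream in batches) can be sketched in a few lines. The class and field names are hypothetical, and the 'cloud' endpoint is simulated with a list rather than a real HTTPS connection.

```python
# Minimal sketch of an on-premises gateway: buffer sensor readings
# locally, then forward them to cloud storage in batches. The "cloud"
# here is just a list; a real gateway would POST over HTTPS.

class Gateway:
    def __init__(self, batch_size=3):
        self.batch_size = batch_size
        self.buffer = []     # readings awaiting upload
        self.cloud = []      # stand-in for a cloud storage endpoint

    def ingest(self, reading):
        """Accept one reading; flush automatically when a batch fills."""
        self.buffer.append(reading)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        """Send the buffered readings upstream as one batched upload."""
        if self.buffer:
            self.cloud.append(list(self.buffer))
            self.buffer.clear()

gw = Gateway(batch_size=3)
for temp in (21.0, 21.4, 21.9, 22.3):
    gw.ingest({"sensor": "hvac-1", "temp_c": temp})
```

Batching like this is one way such gateways keep wide-area traffic low while still collecting data continuously on the local network; real-time alerts would bypass the buffer.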
Local scanning device
'Things' with short-range sensors will often be located in a restricted area but not permanently connected to a local area network (RFID-tagged livestock on a farm, or credit-card-toting shoppers in a mall, for example). In this case, local scanning devices will be required to extract data and transmit it onwards for processing.
Storage & analytics
If you think today's internet generates a lot of data, the Internet of Things will be another matter entirely. That will require massive, scalable storage and processing capacity, which will almost invariably reside in the cloud — except for specific localised or security-sensitive cases. Service providers will obviously have access here, not only to curate the data and tweak the analytics, but also for line-of-business processes such as customer relations, billing, technical support and so on.
User-facing services
Subsets of the data and analyses from the IoT will be available to users or subscribers, presented (hopefully) via easily accessible and navigable interfaces on a full spectrum of secure client devices.
M2M and the Internet of Things have huge potential, but currently comprise a heterogeneous collection of established and emerging, often competing, technologies and standards (although moves are afoot here). This is because the concept applies to, and has grown from, a wide range of market sectors.

M2M sectors

How is M2M being used, and what are its applications in the future?
Perhaps the canonical example of the Internet of Things (and the stuff of many a cheesy futurist visualisation) is the 'smart home'. The components include sensor-equipped white goods, security, lighting, heating, ventilation and entertainment devices, among others, all connected to a local server or gateway, which can be accessed by the appropriate service providers — and, of course, the home owner.
Link: AlertMe
Healthcare is another prominent M2M application, and comes under various banners including e-health, m-health, telemedicine and assisted living. Patients with non-life-threatening conditions can be issued with sensors (for blood pressure, or blood sugar levels for example), sent home and monitored remotely by medical staff — and can often be shown how to interpret the data themselves. This will free up hospital beds and physicians' time for more urgent cases. More generally, consumer-oriented sensors such as the Fitbit can encourage people to adopt healthier lifestyles, helping to keep them out of the doctors' surgeries and hospital beds in the first place.
The smart home is a subset of the 'smart building' — which could be an office, a hotel, a hospital, a manufacturing facility, a retail store or any other public structure. All such buildings consume energy through heating, ventilation and air-conditioning (HVAC) systems, and building automation systems can capture and analyse data from all relevant equipment, allowing cost-saving energy solutions to be created and implemented. Depending on the particular building, other subsystems that can be 'smartened' include structural health, access control and security, lighting, water, lifts, fire and smoke alarms, power and cooling for IT infrastructure.
Given the resources consumed by today's buildings (40 percent of the world's primary energy, according to The World Business Council for Sustainable Development), the potential monetary savings and environmental benefits on offer in this sector are immense.
Link: Smarter Buildings (IBM)
There are many reasons why 'smart' manufacturing is a good idea: digital control systems, asset management and smart sensors can maximise operational efficiency, safety and reliability, while integration with smart building systems and smart grids can optimise energy consumption and reduce carbon footprint. And, of course, the smarter the manufacturing process, the quicker it can respond to changing customer demand. It's no surprise to find that smart manufacturing is seen by western politicians as a way of increasing competitiveness in global markets, although there's no technical reason why Chinese manufacturers, for example, couldn't adopt the same processes.
Automotive & transport
Today's cars routinely bristle with sensors and computing equipment, covering everything from engine management to navigation to 'infotainment'. Automobiles are rapidly becoming connected, context-aware machines that know where they are, where other vehicles are (both locally and in terms of regional traffic), who is driving (via driver face recognition) and how they are driving, and can warn of impending mechanical or other problems, and automatically summon roadside assistance or emergency services if necessary. A 'smart' car can be remotely tracked or immobilised if stolen, and new business models such as 'pay-as-you-drive' insurance can be implemented.
A lamp-post mounted (solar-powered) sensor/comms enclosure and an illustration of its use in traffic monitoring in a 'smart city' environment. (Images: Libelium)
The roads the cars drive on will become smarter too: in towns and cities, lamp-post-mounted sensors can monitor parking spaces, for example, and also warn drivers of congested areas.
Supply chain
Given that passive RFID tags cost only a few cents, it's no surprise to find that M2M technology features heavily in supply chain management: the ability to track, in real time, raw materials and parts through manufacturing to finished products delivered to the customer has obvious appeal compared to patchy data delivered by irregular human intervention. Fleet management systems have long made use of GPS tracking, but cellular-equipped sensors can also monitor the condition of sensitive consignments (temperature for perishable food, for example), or trigger automatic security alerts if a container is opened unexpectedly.
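A minimal sketch of the cold-chain alerting just described, with an illustrative safe temperature band (not taken from any real system):

```python
# Sketch of cold-chain monitoring: flag a consignment whenever its
# temperature leaves the allowed band. Thresholds are illustrative.

SAFE_RANGE_C = (2.0, 8.0)   # e.g. chilled food or pharmaceuticals

def check_consignment(readings, safe=SAFE_RANGE_C):
    """Return the indices of readings that breach the safe band."""
    low, high = safe
    return [i for i, t in enumerate(readings) if not (low <= t <= high)]

# Hourly readings from a hypothetical cellular temperature sensor.
alerts = check_consignment([4.1, 5.0, 9.3, 6.2, 1.4])
```

In a deployed system the breach indices would be timestamps, and each breach would trigger an automatic alert to the carrier rather than just populate a list.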
The sharp end of the supply chain — retail — is fertile ground for M2M technology, applying to areas such as in-store product placement and replacement, kiosks and digital signage, vending machine management, parking meters and wireless payment systems.
Link: M2M Retail Solutions (Verizon)
Field service
Consumer devices, business equipment and industrial plants can all, obviously, suffer faults that require repairing. If these things are all 'smart', delivering real-time status reports to the internet, then field-service operations can be booked quicker, engineers can be equipped with the correct parts and manuals, and site visits can be scheduled efficiently.
Utilities: smart metering and grids
Smart meters for electricity, gas and water, and the smart grids they create, form a major component of the M2M market. Real-time data on resource consumption down to the household level allows utilities to manage demand and detect problems efficiently, while householders can save money by optimising their usage patterns.
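As a toy illustration of why hourly consumption data matters, the sketch below prices the same daily usage under a flat tariff and a time-of-use tariff. All rates and the peak window are invented for the example.

```python
# Sketch: price hourly smart-meter readings under flat vs time-of-use
# tariffs. Rates and the peak window are illustrative assumptions.

FLAT_RATE = 0.20                      # currency units per kWh
PEAK_RATE, OFFPEAK_RATE = 0.30, 0.10  # peak assumed 08:00-20:00

def cost(hourly_kwh, flat=False):
    """Bill a 24-entry list of hourly consumption (kWh)."""
    if flat:
        return sum(hourly_kwh) * FLAT_RATE
    total = 0.0
    for hour, kwh in enumerate(hourly_kwh):
        rate = PEAK_RATE if 8 <= hour < 20 else OFFPEAK_RATE
        total += kwh * rate
    return total

# A night-heavy household: 12 kWh total, mostly outside the peak window.
night_heavy = [0.8] * 8 + [0.2] * 12 + [0.8] * 4
flat_bill = cost(night_heavy, flat=True)
tou_bill = cost(night_heavy)
```

The same 12 kWh costs noticeably less on the time-of-use tariff because most of it falls off-peak, which is precisely the optimisation that household-level real-time data makes possible.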
Security & surveillance
Most people are rightly wary of the Orwellian aspects of widespread automated security and surveillance technology, but there are also plenty of benefits to be had. Smart buildings, including smart homes, can have connected smoke detectors that alert emergency services when triggered, and activate only the appropriate suppression systems; connected burglar alarms can immediately identify the point of entry and motion sensors can track an intruder's progress in real time (the same sensors can identify and track legitimate occupants via wireless access-control systems).
Environmental monitoring
M2M technology has great potential when it comes to monitoring natural or man-made environments. Suitably placed sensors can provide early warning of pollution, forest fires, landslides, avalanches and earthquakes, for example. More generally, air, water and soil quality can be remotely monitored in places of interest, and changes in the abundance and distribution of key species (wildlife or pests) tracked and changes to their habitats logged.
Smart agriculture is a growing field (as it were), with M2M technology available to track the location and condition of livestock, monitor the growing conditions of crops, and optimise the performance of farm equipment (using precise geolocation to minimise wastage in crop-spraying operations, for example).
High-value crops can be monitored by wireless sensors for a range of parameters (air temperature, humidity, soil temperature, soil moisture, leaf wetness, atmospheric pressure, solar radiation, trunk/stem/fruit diameter, wind speed and direction, and rainfall), with real-time data gathered by an on-site gateway, sent to the cloud and accessed via internet-connected PCs or smartphones. This information allows irrigation and other agricultural interventions to be precisely matched to local growing conditions.
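The irrigation decision implied above reduces to a threshold check per zone. A minimal sketch, with hypothetical zone names and an assumed moisture threshold:

```python
# Sketch of precision irrigation: water only the zones whose soil
# moisture falls below a crop-specific threshold. Sensor values and
# the threshold are illustrative.

MOISTURE_THRESHOLD = 30.0   # percent volumetric water content (assumed)

def zones_to_irrigate(readings, threshold=MOISTURE_THRESHOLD):
    """readings: {zone_name: soil moisture %} -> sorted zones needing water."""
    return sorted(zone for zone, m in readings.items() if m < threshold)

# Latest readings relayed from wireless soil-moisture sensors.
field = {"vines-north": 42.5, "vines-south": 27.9, "orchard": 18.3}
dry_zones = zones_to_irrigate(field)
```

A real controller would also weigh the weather forecast and crop stage, but the core of matching intervention to local conditions is this per-zone comparison rather than irrigating the whole field on a timer.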
Any world-changing technology is likely to have its darker applications, and M2M is no exception. Many military applications simply involve ruggedised and security-hardened versions of existing technologies, and this will apply to M2M as much as any other sector. Areas of particular interest to those in uniform are likely to be security and surveillance, transportation and logistics, healthcare and environmental monitoring.
Links: M2M Gains Military Traction / Blueforce Development (Tactical Response, Emergency Medical)

M2M uptake: who is using it, and who is next?

Machine-to-machine (M2M) technology is growing in importance — but which industries have already adopted it, which are likely to, and how big is the market?
Machine-to-machine communication is seen by technologists, analysts and major companies across the world as the next great tool to revolutionise business. However, predictions for the size of the market vary and uptake, so far, has been limited.
In 2004, BusinessWeek predicted that M2M would be a $180bn market by 2008. If you believed that, you'd have been disappointed by a 2007 report from The Economist putting it at around $35bn. By 2010 the market had climbed to $120bn, according to information from M2M specialist Machina Research — two years late and still $60bn off the original BusinessWeek projection. The latest Machina Research report predicts the M2M market will grow from $200bn in 2011 to $1.2 trillion in 2022:
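The Machina Research forecast implies a compound annual growth rate that is easy to back out from the two endpoints:

```python
# Back out the compound annual growth rate (CAGR) implied by the
# Machina Research forecast: $200bn in 2011 -> $1,200bn in 2022.

def cagr(start, end, years):
    """CAGR = (end / start) ** (1 / years) - 1."""
    return (end / start) ** (1 / years) - 1

implied = cagr(200, 1200, 2022 - 2011)   # roughly 17-18% per year
```

A sixfold increase over eleven years sounds dramatic, but it works out to under 18% a year, which is ambitious yet not implausible for a young market.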
Any advanced technology is prone to false starts and an excess of hype. Wildly optimistic predictions were made for Segway scooters, for example, but the mass market never materialised. Similarly, we've been told for years that fusion power, quantum computing, strong artificial intelligence, robotic cars and electric vehicles are just around the corner. Again, none of these technologies have yet fulfilled their promise.
M2M is certainly happening, but the market is fragmented into numerous verticals. Right now there are around 110 million M2M devices connected to the internet, according to Juniper Research. By 2017 this is expected to climb to 400 million. The numbers bandied about obviously depend on the definitions used, however: Machina Research, by contrast, puts the number of M2M connections at the end of 2011 at two billion, and expects this to grow to 18 billion by 2022.
According to Frost & Sullivan, the areas driving this growth will be the automotive industry, with new 'smart' cars; utility companies with smart grids; healthcare and security, along with home automation. Machina Research, meanwhile, puts the top growth-driving vertical markets in the following order: intelligent buildings, consumer electronics, utilities, automotive and healthcare.
According to Cisco, the next nine billion or so devices connected to the internet in 2020 will use M2M technologies. Many of these devices will be used to link the physical world to the internet via sensors that take readings from their local environment and output the information up into the cloud.
For this reason, the entire field is being forced to grapple with questions around data preservation, communication and integrity — and far earlier than other similar technology sectors have had to.
Estimates of the size of the M2M market and its likely growth vary, but the widespread influence that this technology will undoubtedly have is concentrating the minds of all kinds of companies. Those whose M2M strategies succeed will have as much sway over our lives as smartphone vendors and mobile operators do today. M2M is the next ubiquitous technology. Get ready.

The Internet of Things, powered by Machine-to-Machine communication, is already with us, but remains a massive opportunity. Properly implemented, it can retool large parts of the world for better efficiency, security and environmental responsibility — and of course it can generate potentially huge amounts of business for the IT companies that will build and run the systems involved.
Many technology sectors stand to benefit from this new world order, including mobile network operators and fixed broadband providers, system integrators, cloud service providers, mobile app developers, sensor and wireless infrastructure vendors, and purveyors of Big Data infrastructure and analytics.
In an ideal world, M2M equipment will interoperate smoothly, service providers will compete on a level open-standards playing field without attempting to lock customers into their ecosystems, and the Internet of Things will develop with the same explosive inventiveness as did the original internet. The remaining articles in this series will explore how likely that is to happen, and present some examples of M2M in action.

Friday, January 11, 2013


Have we seen the end of Data Centre Evolution?

The data centre as we know it has undergone several transformations over the last few decades and still continues to do so. Having come so far in the evolution of the data centre, something tells me that we’re still not done.
With the Gartner Data Centre events taking place in London and the United States this week, I cannot help but wonder how the Data Centre has undergone several stages of evolution to command the level of importance that it currently does in any organisation that uses IT. It is essentially the heart and nervous system of any IT system, from where IT services are delivered to business users. 
Despite appearances, the data centre industry has an interesting history. In the 1960s, data centres housed mainframes designed for high-end computing, later joined by supercomputers such as the Cray-1. Applications for weather forecasting, molecular analysis, quantum physics and even nuclear simulations were commonplace on these massive machines. These were expensive resources, which only a select few organisations could afford to own and maintain. That changed with the emergence of the micro-computer and better storage technologies in the 1980s, and data centres transformed into server rooms where specialised applications ran.
The real popularity of, and need for, data centres began booming in the dot-com era of the 1990s, when organisations began to require high-speed computing power and internet connectivity on which to host applications. Due to prohibitive hardware costs, it was practical to rent such equipment rather than invest capital in it. This led to the emergence of web-hosting providers and Internet Service Providers, who could offer such services on a dedicated or co-location model.
Fast-forward to today: modern data centres, responsible for offering applications and platforms as a service, have to be designed for massive data volumes at speeds and latencies beyond traditional architectures. Today's web applications for business, social networks and mobile devices demand it. Emerging technologies such as data centre fabric promise to help data centre architects meet these objectives by transforming silos of computing and storage resources into a shared, integrated pool and improving application latency.
Technologies such as virtualization have driven server consolidation in the data centre. An industry peer once told me that they had firm directives from top management to consolidate all their servers from various divisions into the data centre. While the objective was to 'push' servers from all over the enterprise to the data centre, no importance was attached to how to model the data centre architecture efficiently to meet this objective. The analogy my associate gave was: "They cleaned up all the mess in the house, but put it in the attic where it's not visible to outsiders. However, the fact remains that the attic was still a mess." What he meant was that the complexity had been moved from all over the place to the data centre, becoming a nightmare for the data centre manager and his ops team. Reasons for this included heterogeneous network equipment that did not work well together, servers that did not lend themselves well to virtualization, a plethora of management tools, and a data centre architecture that did not serve applications fast enough to the end user. I completely agree with this view: data centre consolidation is not just about throwing all the servers in a room and virtualizing them. That's a start, but it doesn't stop there!
Some organisations see data centre consolidation as a tactical way to save costs, but I believe it is a far more strategic goal that can help IT deliver emerging services in an agile and cost-effective manner. If architected correctly, data centres should help reduce power, real-estate, cooling and cabling costs, improve application latency, and ensure better resource utilisation via dynamic and flexible resource provisioning. As part of data centre architecting, one of the foremost things to keep in mind is to simplify the data centre architecture and the tools used to manage it. Upcoming advances such as software-defined data centres, unified fabric technologies and integrated IT management techniques are key to simplifying the data centre and the path to 'IT Nirvana'.
Providing such agile and flexible resource allocation, along with better accounting of utilised resources and chargebacks, will enable IT to operate as a profit centre providing efficient and cost-effective services. That, in all probability, will earn IT a seat at the decision-making table.
By Sridhar Iyengar, Vice President, Product management, ManageEngine


Gartner Predicts Cloud, Social, Mobile, and Information Forces Will Shape 2013
According to research firm Gartner, several converging forces will influence IT in the coming year.
Succeeding as an education IT leader in the new year will call for becoming comfortable with the undercurrents of several converging forces--cloud, social, mobile, and information--that are shaping the look of IT. That's the analysis of Gartner, which recently released its predictions for 2013 in a public webinar now available as a recording. Although the presentation addressed all organizational segments and both buyers and vendors of IT products and services, several of the company's predictions are especially relevant to the education sector; those are the ones we focus on here.
According to Daryl Plummer, managing vice president and a Gartner Fellow, cloud represents a "global-class delivery model so we can get services to those who need them without worrying about implementation." Social provides an alternative to the traditional team structure, allowing people to interact in ways that are "more like human beings interact." Mobile is a "pervasive access mechanism" for being able to get to and interact with "whomever, wherever, and whenever." The fourth force, information, is the growth of the "big content store"; that store contains not only data, but also "the context of the people who have used that data--where are they located, who do they work with, what kind of actions they take in certain situations."
"When you get all of this coming together on a backdrop of consumerization of the IT function, where business people and individuals are making their own decisions about technology, you get a new world," he stated.
Prediction 1: Data Will Fuel the Killer Apps of the Future
As information surfaces as a dominating force of business, the demands of big data work will grow to 4.4 million jobs around the world by 2015. But only a third of those jobs will be filled, reported Gartner, pointing to a fundamental change in the skills required to manage big data.
"Big data," Plummer noted, encompasses data that arrives at a "high velocity, from many different sources, with many different types of data--structured and unstructured--all coming at you in some kind of pseudo real-time." This work differs from the data work that dominates now, primarily business intelligence and data warehousing. Those are "generally a static view of information. It is a mining of what you have already and making decisions from it," he said. "We're talking about an active view, a dynamic view, where data is streaming at you in real time, [and you're] using that data in making judgments and decisions in interfacing with a customer in real time, in marketing to customers. Who you're going to market to is going to change because of it."
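The contrast Plummer draws, between mining a static warehouse and reacting to a live stream, can be sketched in a few lines of Python. The event values and alert threshold below are illustrative assumptions, not anything from the article:

```python
# Batch (warehouse-style): compute once over data you already have.
def batch_average(values):
    return sum(values) / len(values)

# Streaming (big-data-style): update a running view as each event arrives,
# so a judgment can be made in pseudo real-time rather than after the fact.
def streaming_decisions(events, threshold):
    count, total = 0, 0.0
    for value in events:
        count += 1
        total += value
        running_avg = total / count
        # React to each event immediately instead of waiting for a nightly job.
        yield ("alert" if value > threshold else "ok", round(running_avg, 2))

history = [10, 12, 11, 30, 12]
print(batch_average(history))                  # static view of existing data
print(list(streaming_decisions(history, 20)))  # dynamic, per-event view
```

The batch function answers "what happened," while the generator answers "what is happening right now," which is the shift in skills Gartner is pointing at.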
Finding qualified staff presents the challenge; but the opportunity is to become one of those data experts. As Plummer explained, the massive skills gap will exist because the new roles cross both IT and business and call for experience in information management and analytics as well as business expertise. These jobs include business analysts, chief data officers, data scientists, information architects, and legal and IT professionals whose primary work will be "capturing, analyzing, visualizing, discriminating in, and making decisions from data." Analytics and visualizations, he declared, "are the killer apps of the future."
To stay ahead of that coming gap, Plummer also recommended that managers reconsider their current unfilled positions and ask whether they're truly the right roles for the new era of big data. "It's a time to start asking questions. Your open requisitions for jobs--what jobs are they trying to fill? You need to create a new set of requirements."
Prediction 2: Security Concerns Pervade Mobile Use
Increased use of mobile devices by employees to do social activities through the cloud will result in security and privacy controls that are insufficient. Because employees are increasing their use of collaboration applications such as social networks on mobile devices, Gartner predicts that by 2017, "40 percent of enterprise contact information will have leaked into Facebook."
These micro instances of data breaches are happening for a couple of reasons, Plummer said. First, Facebook has become one of the top five apps installed on smart phones and tablets. Second, disparate social sites and enterprise services are increasingly being linked by users; identity management set up for one site is used to gain access to another site. For example, Microsoft's Social Connector for Outlook allows for integration with Facebook. As a result, he noted, "The integration and painless movement of information from applications to Facebook and others is now automatic and almost invisible."
What Plummer doesn't advise is banning Facebook. "If anyone out there has a policy that bans access to Facebook or Twitter, please stop doing that," he said. "It's a waste of your time. And it's killing your credibility with your users." IT can't stop people from getting into social sites from their own devices. The better approach is to find ways to use social to "foster communication across your company, to foster dynamic communities."
At the same time, bring-your-own-device programs are introducing increased risk into the organization. Gartner predicts that through 2014, employee-owned devices will be compromised with malware at more than double the rate of devices owned by the organization. While that may seem like an obvious outcome of unfettered personal device usage, Plummer observed, it's a fact that needs to be communicated to the user community. "Most people don't realize that while three to five percent of corporate endpoints are compromised by malware, 20 percent of consumer-grade endpoints are compromised by malware."
Most people--including IT staff--don't even realize how many of those endpoints are sitting inside the firewall with direct access to network resources. "Stopping entry to your network is something you've worked on forever," Plummer pointed out. "Now we're talking about compromised devices in massive numbers sitting behind our firewall."
Rather than try to stop the growth of BYOD or invest IT dollars in the purchase and distribution of devices that will somehow be controlled more tightly, Gartner advises its clients to take a lesson from higher education: segment the network. Users on computing devices outside of IT management and control lose the ability to access sensitive network resources, but they can continue "innovating and getting things done" with their smartphones and tablets.
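The segmentation approach Gartner describes can be illustrated with a toy policy check. The device classes, segment names, and access rules here are hypothetical, invented purely to show the idea:

```python
# Hypothetical segmentation policy: managed devices reach everything,
# while BYOD endpoints are confined to a guest segment plus the Internet.
SEGMENT_POLICY = {
    "managed": {"internet", "guest-vlan", "corp-vlan", "datacenter-vlan"},
    "byod": {"internet", "guest-vlan"},
}

def can_access(device_class, resource_segment):
    """Return True if a device of this class may reach the given segment."""
    return resource_segment in SEGMENT_POLICY.get(device_class, set())

# A personal tablet keeps "innovating and getting things done" on the guest
# segment, but a compromised one cannot reach sensitive internal resources.
print(can_access("byod", "internet"))
print(can_access("byod", "datacenter-vlan"))
```

The design point is that the default is deny: an unknown device class maps to an empty set of reachable segments, so malware on an unmanaged endpoint lands behind the firewall but not inside the sensitive zones.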
Then IT can redeploy resources into "security initiatives aimed at detecting and preventing the spread of malware," Plummer said. These include technologies such as mobile device management products that can implement policies to prohibit unsafe activities such as the transfer of data like e-mail.
However, he added, few solutions--even including complete lockdown--are totally effective. His advice: Focus on delivering an enterprise code of conduct policy, "to keep users responsible for the loss of enterprise information. These codes of conduct are much more effective than lockdown in the modern age."
Prediction 3: Gamification Joins the Mainstream
Numerous education organizations have experimented with the use of gamification to enhance student engagement. Now it's crossing into the mainstream as a way to engage workers too. The same techniques used by game designers to keep game players coming back for more--feedback, measurement, and incentives--will be used to keep employees interested in their work. Noting that 30 percent of business transformations fail due to a lack of stakeholder engagement, Gartner predicted that by 2015, 40 percent of Global 1000 organizations will use gamification "as the primary mechanism to transform business operations." In fact, at some point in the next several years, the company suggested, enterprise gamification will surpass consumer game-playing.
"The way people get engaged is a very critical thing to how well they are actually able to work with the systems or processes you're trying to engage them on," Plummer explained. Gamification is at the heart of Innovation Station, an idea generation service run by IdeaScale and used by Davenport University, a private college in Michigan. Participants who submit ideas, comment, or vote earn points based on their actions, which can lead to badges and recognition on a leaderboard.
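The point-and-badge mechanics described for Innovation Station can be sketched roughly as follows. The point values per action and the badge threshold are invented for illustration; a real service would tune them:

```python
from collections import defaultdict

# Hypothetical point values for each action type.
POINTS = {"submit_idea": 10, "comment": 3, "vote": 1}
BADGE_THRESHOLD = 15  # invented cutoff for earning a badge

def score(actions):
    """Tally points per participant from a list of (user, action) events."""
    totals = defaultdict(int)
    for user, action in actions:
        totals[user] += POINTS.get(action, 0)
    return totals

def leaderboard(totals):
    """Rank participants by points, highest first, with a badge flag."""
    ranked = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
    return [(user, pts, pts >= BADGE_THRESHOLD) for user, pts in ranked]

events = [("ana", "submit_idea"), ("ana", "comment"), ("ana", "comment"),
          ("ana", "vote"), ("ben", "comment"), ("ben", "vote")]
print(leaderboard(score(events)))
```

The feedback loop Gartner describes is exactly this: each action gives immediate, measurable credit, and the leaderboard and badges supply the incentive to keep participating.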
To succeed with gamification, however, IT needs to avoid becoming complacent. "Understand what works in a particular culture and plan for iterations and upping the 'game' to avoid fatigue and foster continued engagement," advised Gartner.
Prediction 4: Augmented Reality Takes Shape
Context-based information that appears at the point of a decision or action forms the basis of augmented reality, explained Plummer. "Think about an app on your Android device that allows you to point your camera and see the price of an apartment, or how many jobs are open, or what companies are in that building. You wave it at a restaurant and see messages from your friends saying, 'This restaurant is good. Try the lasagna.' You see increasingly, the idea of using mobile devices that you can wear like glasses." A "glanceable interface" could overlay hints and instructions on an activity as it's being performed.
Although the majority of revenue from wearable smart electronics is initially surfacing from applications such as activity tracking through athletic shoes, eventually wearable electronics will be used to improve worker productivity, asset tracking, and workflow, Gartner predicted. By 2016 augmented reality will be a $10 billion industry.
Windows 8 for Enterprise?
Even as organizations were in the midst of upgrading to Microsoft Windows 7, the company released Windows 8. Gartner predicted that 90 percent of enterprises will bypass deployment of the new operating system, which, Plummer said, was introduced to respond to the dominance of mobile computing, the rise of cloud services, and the ascendancy of Apple and Google. Windows 8 is a transition product, best suited for tablets and convertible devices. From here on out, he added, Microsoft's philosophy will be "mobile first, not PC first." What Windows 8 won't do, he said, is "knock the iPad off its perch."
The "sweet spot" for Windows 8 adoption, Plummer noted, will be in situations where "the worker must perform tasks while standing and walking," not for desktop operations where a keyboard is in constant use.
About the Author
Dian Schaffhauser is a writer who covers technology and business for a number of publications. Contact her at