Data Center is our focus

We help you build, access, and manage your data center and server rooms

Structured Cabling

We help structure your cabling: fiber optic, UTP, STP, and electrical.

Get ready for the #Cloud

Start your Hyper-Converged Infrastructure journey.

Monitor your infrastructure

Monitor your hardware, software, and network (ITOM), and maintain your ITSM services.

Our Great People

A great team to support happy customers.

Saturday, October 18, 2014

Technology Drives Disaster Recovery & Business Continuity



How New Technology Can Boost DR and Business Continuity

The evolution of the data center has brought many new enhancements that can impact your disaster recovery and business continuity planning.
Let’s talk a little shop today. One of the hottest conversations many business managers are having is how their organization can use the data center as a key element for their disaster recovery and business continuity strategies. Cloud computing, data replication and virtualization all play a major role in the disaster recovery and business continuity (DRBC) discussion. Still, the evolution of the data center and new types of resources requires administrators to take a look at their strategies and see where they can improve further.
Although business continuity and disaster recovery can overlap, they are really different IT objectives. With that in mind, the conversation about data center DR strategies has really evolved over the past few years. Where it was once reserved for only the big shops or ones with a lot of dollars to spend, modern IT infrastructure allows a broader range of companies to do a lot more, for a lot less. Smaller organizations are now leveraging private and public cloud environments for their DR needs. In fact, this influx of new business is part of the reason that many data center providers are seeing a boom in services requests.
What are the real driving factors here?
  • Global traffic management (GTM) and global server load balancing (GSLB). The truth is simple: Without these types of modern global traffic controllers, it would be a lot more difficult to replicate and distribute data. Just take a look at what technologies like F5, NetScaler and Silver Peak are doing. They are creating a new logical layer for globally distributed traffic management. Not only are these technologies optimizing the flow of traffic, they are controlling where users go and what resources they can utilize. The digitization of global business now requires administrators to have intelligent devices helping optimize and load-balance traffic all over the world. With virtualization, both physical and virtual appliances are capable of spanning global resources and communicating with each other in real time. From there, they can route users to the appropriate data center as a regular function of policy – or even route entire groups to other live data centers in case of an emergency (a minimal routing-and-failover sketch follows this list).

  • Software-defined technologies. Working with software-based and virtual technologies certainly makes life easier. What we’re able to do now with a logical network controller in terms of creating thousands of virtual connections is pretty amazing. Furthermore, the ability to control traffic, QoS, and even deploy virtual security services makes these new types of technologies very valuable. Remember, the conversation isn’t just around SDN. Software-defined technologies also incorporate security, storage and other key data center components. We are creating logical layers which allow for improved communication between hardware components and global resources. This is the concept of virtualizing servers and services. Inter-linking nodes without having to deploy additional hardware is a big reason cloud computing and the boom in data center resources have become so much more prevalent.

  • High-density computing. Shared environments and multi-tenancy are becoming regular platforms within the modern data center. After all, why not? The ability to consolidate and logically place numerous users on one shared system is pretty efficient. Plus, administrators are able to use highly intelligent blade systems to quickly provision new workloads and rapidly repurpose entire chassis in case of an emergency. Furthermore, converged infrastructure is seeing even more advancement as more SSD and flash technologies become incorporated directly into the chassis. Imagine having the capability to deliver millions of IOPS and hundreds of terabytes of flash storage to key workloads distributed over your corporate cloud. Furthermore, replicating this type of system requires less resource utilization and focuses more on server profiles. This isn’t only a highly effective use of server technology – it’s the utilization of advanced service profiles that are capable of virtualization at the hardware layer.
  • More bandwidth. More fiber, better local and external connectivity, and greatly improved network capabilities are allowing the data center to deliver massive amounts of data at lightning speeds. Already, we are seeing Google Fiber delivering unprecedented speeds to homes for very low prices. I, for one, can’t wait for that service to come to my city. The point is that bandwidth is becoming more available. Edge systems are capable of handling more traffic and very rich content. This increase in WAN-based resources is the direct reason that so many organizations are moving part of their infrastructure into the cloud. Hybrid systems allow for fast data replication and the ability to stay up if your primary data center goes down. Furthermore, when you couple the above three drivers together, you get an environment which can replicate quickly and stay extremely agile.
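
To make the GTM/GSLB idea concrete, here is a minimal routing-and-failover sketch in Python. It is purely illustrative and not tied to any F5, NetScaler or Silver Peak API; the data center URLs, the /health endpoint and the region-to-site policy are assumptions made up for the example.

```python
# Minimal GSLB-style routing sketch (illustrative only; not a vendor API).
# Assumptions: three hypothetical data centers, each exposing a /health endpoint.
import requests

DATA_CENTERS = {
    "us-east": "https://dc-us-east.example.com",
    "eu-west": "https://dc-eu-west.example.com",
    "ap-south": "https://dc-ap-south.example.com",
}

# Policy: which data center a region should normally use, and its fallbacks.
REGION_POLICY = {
    "north-america": ["us-east", "eu-west", "ap-south"],
    "europe": ["eu-west", "us-east", "ap-south"],
    "asia-pacific": ["ap-south", "eu-west", "us-east"],
}

def is_healthy(base_url: str) -> bool:
    """Treat a fast HTTP 200 from /health as 'this site is up'."""
    try:
        return requests.get(f"{base_url}/health", timeout=2).status_code == 200
    except requests.RequestException:
        return False

def route_user(region: str) -> str:
    """Return the first healthy data center for the user's region, per policy."""
    for dc in REGION_POLICY.get(region, list(DATA_CENTERS)):
        if is_healthy(DATA_CENTERS[dc]):
            return DATA_CENTERS[dc]
    raise RuntimeError("No healthy data center available")

if __name__ == "__main__":
    print(route_user("europe"))  # e.g. https://dc-eu-west.example.com if it is up
```

A real global traffic manager would typically apply this logic at the DNS or load-balancer layer rather than in application code, but the policy-plus-health-check idea is the same.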
Are there other advancements that have helped organizations achieve greater levels of redundancy? Of course there are. Everything from the software layer to new types of hardware advancements all help organizations better utilize resources. The bottom line is this – if you haven’t explored a cloud option for DRBC, maybe it’s time. The modern data center has become the home to a lot of really advanced technologies and service delivery models. All of these systems are working together to deliver more services, resources and a lot more agility for your organization.

How Does the CLOUD Change Your DATA CENTER Technology?





One of the ways Microsoft supports its cloud servers is packing them in ITPAC modules which can be deployed quickly to expand capacity in any location (Photo: Microsoft)

How Cloud has Changed Data Center Technology

Let’s face it, if you’re a technologist and you’re reading this article, you’re tied to the cloud in one way or another. Whether you have Gmail synced to your phone or you upload photos to Dropbox, you’re utilizing cloud computing. Over the past few years, cloud services have become more enhanced and prevalent. There is more emphasis on data delivery, our ability to continuously stay connected and how our information is distributed.
Data centers and other technologies have had to adapt to these growing trends by deploying some pretty cool technologies. This is happening at the IT consumerization level and within the data center:
  • Cloud Computing – We know about the cloud. We know that there are now four general models to work with (private, public, hybrid and community). The really amazing part is the open-source and cloud connectivity movement that’s been happening. People behind open source projects like OpenStack and CloudStack are creating powerful cloud API models to interweave various services and even platforms. The great part is that these technologies are still evolving and becoming better. Cloud APIs and connection models push the industry towards a more unified cloud architecture. Now, new concepts around software-defined technologies are helping push the cloud boundaries even further. Software-defined storage is a lot more than just a buzz term. It’s a way for organizations to manage heterogeneous storage environments under one logical layer. When convergence around network, storage and compute intersects with software-defined technologies, you create the building blocks for a commodity cloud data center.
  • Network Communications – This is where it gets really interesting. We’ve heard about software-defined networks, but the reality is that cloud-based networking has become pretty advanced. Cloud providers are deploying highly intelligent switching components which are capable of handling thousands of virtual connections. Furthermore, they’re able to present multiple networks to one another and still keep various services segmented. We are seeing more converged and unified systems where advanced networking capabilities are built directly into the rack, server, and storage infrastructure. Layer 4-7 switches are not only controlling traffic – they’re intelligently manipulating it based on various variables. These controls can revolve around geographic policies, connection points, and even device interrogation rules. This is also where we begin to include software-defined networking as a powerful cloud data center concept. SDN can create very intelligent, globally connected environments. Furthermore, SDN can help with load-balancing cloud and data center infrastructures. SDN already helps with global traffic management by logically sending traffic to the appropriate data center. Moving forward, SDN will strive to create even more fluid data center traffic flow automation. These types of efforts will help with downtime, data resiliency, and disaster recovery planning.
  • Disaster Recovery/Business Continuity – Emergency events can happen at any time – and for any reason. This is where the cloud has helped many organizations create solid disaster recovery or business continuity environments. Whether it’s an active site or a “pay-as-you-go” public cloud model, DR strategies are becoming feasible for more organizations. A lot of this has to do with better global server load balancing (GSLB) and global traffic management (GTM) techniques. Our ability to route traffic based on numerous variables has empowered organizations to distribute their environments and their data. Not only do GSLB and GTM help by creating one logical network flow for data traffic and user access – administrators are also able to keep users closer to their data centers. By identifying the user’s geo-location, cloud technologies are able to route users to the data center nearest to them. In the event of a failure, GTM and GSLB are able to immediately and transparently route the users to the next available set of data center resources.
The latest Cisco Global Cloud Index shows just how fast everything in the cloud is growing:
  • Annual global cloud IP traffic will reach 5.3 zettabytes by the end of 2017. By 2017, global cloud IP traffic will reach 443 exabytes per month (up from 98 exabytes per month in 2012).
  • Global cloud IP traffic will increase nearly 4.5-fold over the next 5 years. Overall, cloud IP traffic will grow at a CAGR of 35 percent from 2012 to 2017.
  • Global cloud IP traffic will account for more than two-thirds of total data center traffic by 2017.
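
As a quick sanity check, those Cisco figures are internally consistent; the “nearly 4.5-fold” growth, the 35 percent CAGR and the 5.3-zettabyte annual total all follow from the monthly traffic numbers:

```python
# Reproducing the growth figures quoted from the Cisco Global Cloud Index above.
traffic_2012 = 98     # exabytes per month (2012)
traffic_2017 = 443    # exabytes per month (2017)
years = 5

growth_multiple = traffic_2017 / traffic_2012              # ~4.52, i.e. "nearly 4.5-fold"
cagr = (traffic_2017 / traffic_2012) ** (1 / years) - 1    # ~0.352, i.e. ~35 percent
annual_2017_zb = traffic_2017 * 12 / 1000                  # ~5.3 zettabytes per year

print(f"growth: {growth_multiple:.2f}x, CAGR: {cagr:.1%}, annual: {annual_2017_zb:.1f} ZB")
```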
More is being stored within the cloud and more devices are connecting to the Internet. There are more services delivered via the web and entire organizations can go live without purchasing a single piece of equipment. As the reliance on cloud computing and the Internet continues to expand, the need for continuous innovation will help create even better platforms.

One-Time Passwords with SendQuick ConeXa

Do you need to build an application that can be accessed easily and securely?
These days, most applications are built on web-based platforms. To secure access to them, several approaches are commonly used:
1. Using a VPN. Staff who work outside the office or are mobile are required to connect via VPN.
2. Using SSL. The application is hosted on a web server secured with SSL (Secure Sockets Layer).

On their own, these measures do not guarantee secure application access.
One approach that is widely used today is the OTP (One-Time Password). A password is generated only at the moment a staff member or user is about to access the system, and the login information is delivered via SMS.

SendQuick ConeXa answers exactly this need.

Please contact us for detailed information about ConeXa.

sendQuick ConeXa

2FA SMS OTP to Elevate Your Remote Secure Access
As staff and customers grow increasingly mobile and globalized, secure remote access is a key area of rising importance for many enterprises. Recognizing the convenience and benefits of using SMS for secure remote access, TalariaX sendQuick has come up with an SMS gateway appliance for authentication and authorization – sendQuick ConeXa.
sendQuick ConeXa comes with a built-in server that generates a One Time Password (OTP) delivered via Short Messaging Service (SMS), providing an added level of security that works with any RADIUS-based SSL VPN. Users do not need to install any software and will be able to access information online securely via their mobile phones without the use of tokens, for greater convenience. In addition, easy integration with local and external directories or Microsoft Active Directory (AD), as well as the ability to support multiple SSL VPN sessions, positions sendQuick ConeXa as a hassle-free and cost-effective solution for companies. Updated patches of sendQuick ConeXa offer optimized messaging speed with increased stability.
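
To illustrate the general idea behind an SMS OTP flow (a minimal sketch only, not sendQuick ConeXa's actual implementation), the Python snippet below generates a short numeric one-time password with an expiry time and hands it to a hypothetical send_sms() helper; in a real deployment the appliance and its GSM modems or SMSC connection would deliver the message.

```python
# Generic SMS OTP sketch -- illustrative only, not the sendQuick ConeXa implementation.
import secrets
import time

OTP_DIGITS = 6
OTP_TTL_SECONDS = 5 * 60          # configurable expiry, as in the feature list below
_otp_store = {}                   # phone number -> (otp, expiry timestamp)

def send_sms(phone: str, text: str) -> None:
    """Hypothetical SMS gateway call; a real system would use the appliance/modem."""
    print(f"SMS to {phone}: {text}")

def issue_otp(phone: str) -> None:
    otp = "".join(secrets.choice("0123456789") for _ in range(OTP_DIGITS))
    _otp_store[phone] = (otp, time.time() + OTP_TTL_SECONDS)
    send_sms(phone, f"Your one-time password is {otp} (valid {OTP_TTL_SECONDS // 60} min)")

def verify_otp(phone: str, submitted: str) -> bool:
    otp, expires = _otp_store.get(phone, (None, 0))
    if otp is None or time.time() > expires:
        return False
    del _otp_store[phone]          # one-time: a password can be used only once
    return secrets.compare_digest(otp, submitted)
```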
TalariaX sendQuick® ConeXa is a plug-and-play appliance with key features including:
    – SMS One-time-password (OTP) for 2-factor authentication
    – SMS password request for On-Demand SMS OTP feature
    – Built-in authentication and authorisation server
    – Easy integration with all RADIUS based SSL VPN as well as other RADIUS based systems
    – Customizable user message with configurable OTP expiry time (minutes)
    – Support for normal SMS, Flash and Message Overwrite SMS
    – Unlimited user license
    – Scalable to support up to 32 GSM modems
    – Options for RAID and High Availability (HA) for zero down-time implementation

Download the sendQuick ConeXa Brochure
 

Friday, October 17, 2014

How Often Must a Fire Pump Be Tested?




Understanding how often a fire pump needs to be tested in accordance with NFPA 25, Inspection, Testing, and Maintenance of Water-Based Fire Protection Systems, can be tricky. For example, many people assume that if they conduct the annual flow test for a fire pump, no additional testing is necessary. While conducting this test is vital to assuring that the pump still performs as it was designed to, conducting a single test each year does not provide a high level of reliability.
Another reason it can be difficult to keep track of when a pump needs to be tested is that the criteria that establish the frequency for the no-flow test (also called a churn test or operating test) have changed in each of the last three editions of the standard. In the 2008 edition of NFPA 25, the standard simply required a weekly churn test and an annual flow test. This practice had been followed since the inception of NFPA 25 in 1992 and had seemingly been well received, since no changes were made to these frequencies for 15 years.
As electric-powered fire pumps became more prevalent, though, the technical committee began receiving questions addressing the need to conduct the operating test 52 times a year. During the development of the 2011 edition of the standard, the committee vigorously debated the idea of moving the frequency for the churn tests for electric driven pumps to a monthly activity. Following several certified amending motions at NFPA’s annual technical meeting, the move to a monthly churn test for electric-driven pumps was finalized for the 2011 edition.
The debate persisted, however, and spilled over into the development of the 2014 edition of the standard. The Fire Protection Research Foundation commissioned a study on the reliability of fire pumps in an attempt to determine if the new frequencies for operating tests were on point. This report, available under the “reports and proceedings” section of the Foundation’s website, nfpa.org/research, provided the technical committee with the information necessary to determine which pumps needed more frequent testing to confirm that they would function when called upon. After reviewing this data, the committee left the baseline requirements for diesel fire pumps unchanged; the operating test must still be conducted weekly. The committee did add a provision that allows an alternate test frequency to be used when the revised frequency is supported by a risk analysis.
The requirements for electric-driven pumps were more substantially modified. Based on the data gathered from the Foundation project, it was determined that not all pumps, and more specifically the buildings that house them, would benefit from reducing the number of operating tests per year from 52 to 12. The technical committee determined that, either due to the risk associated with the building, or with the type or arrangement of the pump or controller, certain fire pump arrangements needed to be tested more frequently. As such, the 2014 edition of NFPA 25 requires electric-driven pumps in high-rise buildings, electric-driven vertical turbine pumps, and pumps using limited service controllers to undergo weekly operating tests. For all other electric-driven fire pumps, a monthly operating test is still appropriate. The risk analysis that can be conducted for diesel-driven pumps also applies to electric-driven pumps.
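
As a quick summary of the frequencies described above, here is a small lookup sketch. It simply restates the article's reading of the 2014 edition, not the standard's authoritative text, and a documented risk analysis can still justify an alternate frequency for either driver type.

```python
# No-flow ("churn") test frequency per NFPA 25, 2014 edition, as summarized above.
# This is a plain restatement of the article, not the standard's authoritative text.
def churn_test_frequency(driver: str, high_rise: bool = False,
                         vertical_turbine: bool = False,
                         limited_service_controller: bool = False) -> str:
    if driver == "diesel":
        return "weekly"            # baseline unchanged for diesel-driven pumps
    if driver == "electric":
        if high_rise or vertical_turbine or limited_service_controller:
            return "weekly"        # higher-risk electric arrangements
        return "monthly"           # all other electric-driven pumps
    raise ValueError("driver must be 'diesel' or 'electric'")

# The annual flow test is required for all fire pumps regardless of driver.
print(churn_test_frequency("electric", high_rise=True))   # weekly
print(churn_test_frequency("electric"))                   # monthly
```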
Depending on which edition your jurisdiction adopts, the frequencies for conducting these tests may vary. The 2014 edition, though, contains the latest information from the technical committee. 

Wednesday, October 15, 2014

OUR SOLUTION: Managing Documents at Construction / Contractor Companies

In our experience working with contractor companies, from large firms down to small ones, we have repeatedly seen them struggle to catalog, trace and manage their documents.

Yet the documents they manage are all valuable: designs, material specifications, drawings, photos and other project-related files.

This reminds me of a meeting with a partner in South Jakarta. He complained that all of his data, his documents to be precise, had been taken by a trusted employee who had worked for him for a long time. All of his valuable documents sat on a single shared hard disk that everyone accessed.

It is time to adopt document management, also known as a Document Management System (DMS). With a DMS, accessing, managing and storing documents becomes very easy.

Documents in the form of drawings, files and even video can be managed easily with a DMS. Two of the DMS products we carry are SOHODOX and GLOBODOX.

Please try the software for yourself at http://www.sohodox.com/download

Contact us if you run into any issues, need technical assistance, or would like pricing for this solution.


Special Discount on SOHODOX for CONSTRUCTION Companies



The best way for any small business to organize their digital documents is by using an eDMS like SOHODOX. You can organize your digital documents in a manner of your choosing, find them quickly and never lose track of any document. If you work in the construction industry, now is the time to get SOHODOX, since there is a 25% discount for professionals and businesses involved in the construction industry.

Working in the Construction industry means that you are always flooded with paper. Managing all those permits, blueprints, invoices, etc. in hard copy can be really tedious. Simply digitizing them still doesn't help you find the right document at the right time. Using folders in Windows to store and access your digital documents is slow and inefficient.

SOHODOX knows. And that's why SOHODOX contains all the features needed for your construction office – tags, powerful search features, the ability to link related documents and auto capture from Windows folders. It's a simple tool to help you save time and ensure that you never lose another document. For a limited period only, SOHODOX is offering a 25% discount for anyone in the construction industry. Whether you are an accountant, contractor or manager, if you work for a construction company, you're eligible for the discount.


WatchGuard Earns NSS Recommended Rating




NSS Security Value Map
Our XTM 1525 is a top-rated Next Generation Firewall, based on a real-world comparative analysis by NSS Labs. Designed for the enterprise, the XTM 1525 delivers the best combination of both security effectiveness and value for cost per protected Mbps, according to NSS.
NSS test results impact buying decisions in the marketplace, and the "NSS Recommended" label that WatchGuard carries is important validation. Be sure to share our impressive score, along with a copy of the NGFW Security Value Map with your customers and prospects.
NSS test results
The WatchGuard product blocked 96.7 percent of attacks against server applications, 98.7 percent of attacks against client applications, and 97.8 percent overall. It proved effective against all evasion techniques tested and passed all the stability and reliability tests, including all application control, identity awareness and firewall policy enforcement tests.
Learn more about this great NSS win and the XTM 1525.

Sunday, October 12, 2014

Send SMS from Gmail Email with OZEKI



One interesting email-to-SMS feature is the ability to send SMS messages from a Google (Gmail) account.

How to send Email to SMS from Google Mail (Gmail)

Here are step-by-step instructions for configuring Gmail to forward incoming emails to an Ozeki NG E-mail user. You can also learn how to receive email notifications for your incoming SMS messages. Ozeki NG SMS Gateway offers outstanding technology.
This solution will improve your corporate communication to a great extent. By forwarding your emails as SMS messages you can be sure that all the recipients will get your messages. This technology has great potential; it can help you arrange your meetings in no time. You can also keep in touch with your customers, business partners, staff, and even family and friends.

How this solution works

Ozeki NG SMS Gateway has a great function called SMS Gateway forwarding. With the help of this function you will be able to send your email messages as SMS messages from your Gmail account to mobile phones. In order to use this option you need to create an E-mail user in the SMS Gateway. This E-mail user will periodically download incoming email messages from the Gmail POP3 account. When an incoming email message arrives, Ozeki NG SMS Gateway will also send an SMS notification about it. In this SMS message you will see the sender and the subject line of the email message.

Figure 1 - System architecture
It is also possible to configure your system to be able to receive SMS messages in your Gmail account. In this case the SMS message arrives at Ozeki NG SMS Gateway and it is forwarded to the Gmail account as an email message.

There are two ways how the SMS gateway forwards messages to the mobile network. Either a GSM modem is used for this purpose that is attached to the PC with a datacable or the software connects directly to the SMSC (SMS Center) of the mobile service provider via the Internet.

Gmail to SMS configuration

Log in to your Gmail account. First, click the Settings menu. Second, select the Forwarding and POP menu item. Third, under POP Download, click Enable POP for all mail. Fourth, click Save Changes (Figure 2).

Figure 2 - Google Mail - Settings/Forwarding and POP/Enable POP for all mail

Ozeki NG - POP3 settings in the E-mail user

Provide the following data (Figure 3):
  • POP3 Server: pop.gmail.com
  • Port: 995
  • POP3 Username: your login name
  • POP3 Password: your password
  • SSL: enable
Check the SSL box.

Figure 3 - POP3 settings in the E-mail User of the Ozeki NG
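
The settings above describe a standard POP3-over-SSL mailbox poll. Purely to illustrate what the Ozeki E-mail user does behind the scenes (this is not Ozeki's own code), a minimal Python check of the same account would look like the sketch below; the username and password are placeholders, and Gmail may require an app-specific password for this kind of access.

```python
# Minimal POP3-over-SSL poll of the Gmail mailbox configured above (illustration only).
import poplib
from email.parser import BytesParser
from email.policy import default

POP3_SERVER = "pop.gmail.com"
POP3_PORT = 995
USERNAME = "your.login@gmail.com"   # placeholder
PASSWORD = "your-password"          # placeholder (Gmail may require an app password)

def fetch_new_messages():
    mailbox = poplib.POP3_SSL(POP3_SERVER, POP3_PORT)
    mailbox.user(USERNAME)
    mailbox.pass_(PASSWORD)
    count, _ = mailbox.stat()
    for i in range(1, count + 1):
        _, lines, _ = mailbox.retr(i)
        msg = BytesParser(policy=default).parsebytes(b"\r\n".join(lines))
        # The gateway would turn these fields into the outgoing SMS text.
        print(f"From: {msg['From']} | Subject: {msg['Subject']}")
    mailbox.quit()

if __name__ == "__main__":
    fetch_new_messages()
```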

SMS to Gmail Configuration

If an SMS message arrives in the Ozeki NG, it can be forwarded to your Gmail e-mail address.

Ozeki NG - SMTP settings in the E-mail user

Provide the following data (Figure 4):
  • Use this e-mail address as the sender address: ozekingsms@gmail.com
  • SMTP Server: smtp.gmail.com
  • SMTP Port: 25
  • SMTP Username: your login name
  • SMTP Password: your password
  • Check "My SMTP server requires authentication."
  • Check "My SMTP server requires SSL connection."

Figure 4 - SMTP server settings in the SMS to E-mail tab
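
For the reverse direction, the SMTP settings above are what the gateway uses to hand an incoming SMS to Gmail as an email. The sketch below shows the equivalent call in plain Python; note that it uses STARTTLS on port 587, which is what Gmail commonly expects for authenticated submission, whereas the figure shows port 25 with the SSL option. The credentials and addresses are placeholders.

```python
# Forwarding an incoming SMS as an email via Gmail's SMTP server (illustration only).
# Uses STARTTLS on port 587, a common requirement for authenticated Gmail submission;
# the article's screenshots show port 25 with the SSL option checked instead.
import smtplib
from email.message import EmailMessage

SMTP_SERVER = "smtp.gmail.com"
SMTP_PORT = 587
USERNAME = "ozekingsms@gmail.com"   # sender address from the settings above
PASSWORD = "your-password"          # placeholder (an app password may be required)

def forward_sms_as_email(sms_sender: str, sms_text: str, to_addr: str) -> None:
    msg = EmailMessage()
    msg["From"] = USERNAME
    msg["To"] = to_addr
    msg["Subject"] = f"SMS from {sms_sender}"
    msg.set_content(sms_text)

    with smtplib.SMTP(SMTP_SERVER, SMTP_PORT) as server:
        server.starttls()                 # encrypt the session
        server.login(USERNAME, PASSWORD)  # "My SMTP server requires authentication"
        server.send_message(msg)

if __name__ == "__main__":
    forward_sms_as_email("+6281234567890", "Meeting moved to 3 PM", "you@example.com")
```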

In conclusion...

If you are looking for a stable and long-term solution that improves your corporate communication, Ozeki NG SMS Gateway is your choice. With the help of this outstanding software you can build your own SMS system. Your own SMS system can save time and money, and it makes your business more effective and prosperous.
Ozeki NG SMS Gateway can be obtained from the download page: Download Ozeki NG SMS Gateway!
Thank you for reading this guide!

10 Strategic Technologies for 2015


For organizations looking to stay on top of the latest technology trends, Gartner has released its annual list of the top 10 strategic technology trends they say organizations should keep their eye on.
“We have identified the top 10 technology trends that organizations cannot afford to ignore in their strategic planning processes,” said David Cearley, vice president & Gartner Fellow, in the company’s announcement. “This does not necessarily mean adoption and investment in all of the trends at the same rate, but companies should look to make deliberate decisions about them during the next two years.”
A strategic technology trend is one that has the biggest potential to significantly impact individuals, businesses and IT organizations over the next three years, according to Gartner. The top trends for 2015 cover the merging of real and virtual worlds, the introduction of intelligence everywhere and the technological impact of the digital business shift, according to Cearley.
Gartner’s top 10 strategic technology trends are:
Computing everywhere: A majority of consumers cannot live without their mobile devices, and as mobile devices continue to grow Gartner predicts organizations will need to focus on diverse context and environments, as opposed to just the device.
“Phones and wearable devices are now part of an expanded computing environment that includes such things as consumer electronics and connected screens in the workplace and public space,” said Mr. Cearley. “Increasingly, it’s the overall environment that will need to adapt to the requirements of the mobile user. This will continue to raise significant management challenges for IT organizations as they lose control of user endpoint devices. It will also require increased attention to user experience design.”
The Internet of Things: The Internet has played a big role in today’s modern world, but as the Internet of Things continues to proliferate, the Internet’s role is expanding to a diverse range of devices and communication streams. Gartner predicts it will only continue to grow.
3D Printing: While 3D printing has been around for a while, it’s finally starting to gain some real momentum. Gartner predicts the worldwide shipments of 3D printers will grow 98% in 2015, with that number doubling in 2016. According to Gartner, 3D printing is a real, viable and cost effective solution that can help organizations improve designs, streamline prototyping and short-run manufacturing.
Advanced, Pervasive and Invisible Analytics: “Every app now needs to be an analytic app,” said Cearley. “Organizations need to manage how best to filter the huge amounts of data coming from the IoT, social media and wearable devices, and then deliver exactly the right information to the right person, at the right time. Analytics will become deeply, but invisibly embedded everywhere.”
Context-Rich Systems: Gartner says applications that are able to understand their users, are aware of their surroundings and can respond appropriately based on context are on the rise. Gartner predicts these apps could simplify an increasingly complex computing world.
Smart Machines: According to Gartner, the smart machine era will be the most disruptive in the history of IT. Virtual personal assistants, smart advisors, advanced robots and autonomous vehicles already exist, and Gartner says as smart machines continue to evolve we can expect a new age of machine helpers.
Cloud/Client Computing: “Cloud is the new style of elastically scalable, self-service computing, and both internal applications and external applications will be built on this new style,” said Cearley. “While network and bandwidth costs may continue to favor apps that use the intelligence and storage of the client device effectively, coordination and management will be based in the cloud.”
Software-defined applications and infrastructure: In order to keep up with changing demands of digital business, Gartner says computing needs to move away from static to dynamic models to scale systems up or down.
Web-scale IT: Gartner sees the emergence of more organizations acting, thinking and building apps and infrastructure similar to Web giants such as Amazon, Google and Facebook. Gartner notes that the first step toward a Web-scale IT future should be DevOps—the marriage between the development and operations teams.
Risk-based security and self-protection: Last but not least, Gartner concludes that all roads to the digital future lead through security. Once organizations realize it’s impossible to provide a 100% secured environment, they can begin to apply more sophisticated risk assessment and mitigation tools. Gartner believes every app will need to be self-aware and self-protecting.

About Christina Mulligan

Christina is the Online & Social Media Editor of SD Times. She is a 2012 graduate of Stony Brook University’s School of Journalism, graduating with a Bachelor's degree in broadcast journalism and a concentration in public affairs. She has interned at WNET Metrofocus, WABC Eyewitness News and Newsday.

12 Characteristics of Modern Application Development




We are witnessing a transformation in application development tools and techniques that is changing how enterprise software is being constructed and deployed. This transformation is the broadening of mobile application development approaches initially instigated by the mobile revolution (yes, the iPhone changed everything) to embrace initiatives for back-end Internet of Things architectures and even employee apps.
While the consumer app economy embraced these new tools and techniques rapidly in the last few years, enterprises have moved much more slowly. In the last couple of years however, enterprises have begun to get on the bandwagon.
To be sure, most enterprises will be running, maintaining, even enhancing old applications for a long time to come. IDC research over the years consistently shows that some 75% to 85% of development resources are used in maintaining existing applications. But for new projects and initiatives, it is becoming clear that we are in a different era, and enterprises are now mobilizing to leverage this new method of application development.
Here are 12 attributes that characterize the new mode of application development beginning to make inroads in enterprises:
  1. Mobile: Supporting touch interaction and adaptability to a large range of screen sizes and pixel densities. Factoring application functions into simpler, shorter workflows. Factoring applications into many apps. Factoring systems into APIs and apps.
  2. Cloud-Backed: Project assets stored in the cloud catalyze collaboration between stakeholders. Using cloud resources for testing. Using MBaaS services such as authentication and notification. Avoiding hardware provisioning. Adopting more abstract and machine-independent application models for back ends in the form of PaaS.
  3. Agile: Incremental, frequent releases. Rapid response to change. Teams working in an integrated fashion. Collaboration with social tools, and integration of users into the development process, including frequent user validation.
  4. Continuously Integrated and Delivered: Applications must be integrated to run daily. Heavy reliance on automated testing. Integration of testing into the development cycle, and frequent and incremental changes to application users, potentially while preserving a managed degree of generational compatibility.
  5. DevOps-Enabled: Developers own deployment or work seamlessly with Ops staff to release, test, refine and rerelease applications to users. Using self-service portals to enable many stakeholders to participate and have accountability for the application. A view of the application life cycle that integrates its performance through delivery to the end user.
  6. App Store Delivered and Extended: The app is delivered in an app store and/or uses modules or off-the-shelf components and services. Components and extensions are surfaced in module or app directories where they can be discovered and integrated.
  7. Analytics Infused: Developers get rich intelligence on application usage. Data on app modules and screen elements are used. Fielding experiments and tests with automation and quick turnaround (e.g. A/B testing of planned features).
  8. User Experience-Centric: Focusing on the design and appearance of the application that brings the front-end designer and/or application programmer into the enterprise development process more fully than ever before.
  9. Socially Oriented: Integrating of user-interface patterns borrowed from social networks, such as timelines, event streams, social graphs and other social metadata. Data is updated using event-based push-oriented patterns. Integrated search functionality. Seamless support for content elements such as images and video. Integration with consumer or enterprise social networks.
  10. API Factored and Surfaced: The move to APIs involves a comprehensive movement to componentize and granularize back-end software in order to achieve composable and easy-to-evolve back-end platforms. Today’s API architectures use standardized, loosely coupled and lightweight REST call formats to reduce complexity and maximize accessibility (see the short sketch after this list).
  11. Lightweight: Less complex software that is less time-consuming to install, learn and use. Software that uses a smaller resource footprint—including on disk, memory and CPU—and enjoys quick startup and recycling times. The trend toward lightweight can be witnessed in both the tools sphere, such as the increased use of editors or online IDEs, and in deployment architectures, such as the use of Web Servers in place of App servers, the use of NoSQL in place of relational databases, and the use of containers (e.g. Docker) in place of virtual machines.
  12. Model-Driven: Rich use of visual tools to support abstraction in the development toolset such as for relations in a data model, business logic flows, and process flows. Some tools rely entirely on such approaches to construct applications, while others use them selectively in domain-specific ways for building various parts of the application such as the user interface or process workflow.
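
As a tiny illustration of the API-factored, lightweight style described in points 10 and 11, the sketch below posts a plain JSON payload to a hypothetical REST endpoint; the URL, token and fields are made up for the example.

```python
# Calling a hypothetical, API-factored back end with a lightweight REST request.
# The endpoint, token and fields are illustrative assumptions, not a real service.
import requests

API_BASE = "https://api.example.com/v1"
TOKEN = "replace-with-a-real-token"

def create_task(title: str, assignee: str) -> dict:
    response = requests.post(
        f"{API_BASE}/tasks",
        json={"title": title, "assignee": assignee},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=5,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    print(create_task("Review Q3 blueprints", "alice"))
```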
These changes in the aggregate are as influential in evolving the state of enterprise applications as object orientation or distributed computing. While not all projects have all 12 ingredients, enterprises that can ascribe four or five of them to their new application efforts are definitely engaging in modern application development. IDC believes that some 60% to 80% of new application development initiatives support at least four of these characteristics.

About Al Hilwa

Al Hilwa is an industry analyst at research firm IDC, specializing in application development research. He has written columns for tech publications and is widely quoted in the media, including the Financial Times, The Wall Street Journal, The New York Times, and National Public Radio. Prior to IDC, he spent seven years at Microsoft in the Server & Tools division, and five years as an industry analyst at Gartner specializing in database systems. Before Gartner, he worked in IT in various industries. He holds an MS in Computer Science and a BA in Mathematics and Computer Studies. You can follow him on Twitter at @AlHilwa.
See more at: http://sdtimes.com/analyst-watch-12-characteristics-modern-application-development/