Taking Stock
An endpoint manageability state of the nation
A Computing content hub, in association with Intel
Hardware strategy hub: Maximising your cloud and on-prem potential
Developed for the IT professionals of today and tomorrow, Intel vPro is built for business. Its manageability technologies allow IT to keep a highly dispersed workforce patched and protected, whilst its stability features support consistent rollouts and reliable lifecycle management. So, no matter where your users find their office, managing a fleet is made easier. With each component and technology designed to a professional grade, IT can be confident it has the tools to enhance everyone's productivity and help secure the business's data.
Upskilling for AI success
The importance of hardware for building strong AI foundations
Taking AI to ROI
Why your organisation’s human and technology cyber security challenges are inextricably linked
ARTICLES
Modern manageability
Taking charge of your increasingly remote, diverse, and numerous endpoint devices
Behind the keyboard
IT leaders reveal how they are meeting their human workforce challenges with modern technology solutions
The hardware platform in 2021
How the enterprise endpoint needs of IT leaders are changing
Coming soon
Why your endpoint device strategy should be driven by user experience
RESEARCH REPORTS
The right workloads in the right places
Computing, in association with Intel, brings you a series of research reports, webinars, and articles delving into the experiences and expectations of today's IT decision-makers. This hub cuts through the hyperbole to uncover how organisations are achieving ROI from their initiatives, why on-premises computing is still important in the cloud era, and how organisations are moving from initial concept to business success. It will explore the challenges and achievements of organisations when it comes to AI strategy, where to host workloads, cloud success, and other datacentre issues.
Organisations are under pressure to develop capable hybrid computing environments, then deploy digital capabilities wherever they're best suited and most cost-effective. They must also innovate to remain competitive, all while coping with security risks, skills shortages, and environmental concerns. How are IT leaders approaching their cloud and datacentre strategies today?
Is ‘Problem in Chair, Not in Computer’ a cyber security misnomer?
No PICNIC
An eye for AI
Real-world use cases with real results
Accelerate your HPC strategy
From standing start to peak performance
Computing webinar:
Michael Galarnyk from Cnvrg and Stephan Gillich, Director of AI and Technical Computing GTM at Intel, discuss AI use cases alongside dedicated research
VIDEO
Stuart Sumner, Editorial Director at Computing, speaks with New Look CTO Ed Alford about managing endpoints today
Remote endpoint management
INTERVIEW
How to move beyond AI proof of concept
Real world AI success stories
Are we making the most of AI?
Real-world use cases with real results: how organisations are moving from proof of concept to revenue-generating success
Computing editor Tom Allen hears from Joanna Smith, interim CTIO at StepChange, about endpoint buying in the public and private sectors
Endpoint buying process
Find out more about Intel
Intel (Nasdaq: INTC) is an industry leader, creating world-changing technology that enables global progress and enriches lives. Inspired by Moore’s Law, we continuously work to advance the design and manufacturing of semiconductors to help address our customers’ greatest challenges. By embedding intelligence in the cloud, network, edge and every kind of computing device, we unleash the potential of data to transform business and society for the better. To learn more about Intel’s innovations, go to newsroom.intel.com and intel.com.
About the sponsor – Intel
On-prem, in control:
Why some workloads are still at home in your data centre
The importance of optimising on-premises infrastructure to support mission-critical workloads
Accelerators powering HPC success
Hype versus reality: Why some organisations are opting to decloud
To learn more about Computing's research into real-world AI use cases with real results, read the full report.
FURTHER READING
Understanding the data your business collects is crucial to its success. Acting on insights and transforming historical interactions into predictive analysis are key to successful business models. AI, a popular buzzword among industry professionals and employees alike, still has a long way to go before it is in widespread use and organisations are maximising the value of their data.
The state of play
Computing's latest research into this space, conducted in collaboration with Intel®, finds AI technology is extremely attractive, with 98 per cent of organisations surveyed saying they are at least interested in employing AI or machine learning. However, just 7 per cent could say they had fully implemented the technology. Around a third of respondents are still in the trialling/incubating stage of adoption, establishing AI frameworks mainly for data analytics and natural language processing.

Many factors contribute to relatively slow adoption. Talent shortages, economic uncertainty, and favouring other digital programmes because day-to-day IT management is so time-consuming are all pervasive problems. However, while organisations evidently have a long way to go before they reach AI maturity, that does not mean they cannot experience the gains from lower-hanging fruit. For those at the beginning of their AI journey, data analytics is a natural home for leveraging AI's potential to spot patterns and learn over time. Savings on cost, time, and oversight are huge gains reported by organisations utilising AI. It allows organisations to make decisions confidently, backed by intelligent, digestible information.
Ultimately, maturity denotes an organisation’s ability to fully achieve and scale impact from their AI systems. Developing the appropriate data and technology infrastructure, while educating teams and fostering an innovative environment, ensures benefits are fully realised. Organisations that are reluctant or have stalled in their AI journey should focus on the business outcomes and the most cost effective and easy wins – drawing on partner support where needed. Around a third of respondents in the AI trialling stage report the technology is already having a significant positive impact on their organisation, with 63 per cent of those in the midst of implementation similarly recognising substantial improvements. Strikingly, roughly 90 per cent of organisations fully utilising AI state it is having a considerable positive influence. Demonstrably, AI transformation affords early successes and with that comes increased faith in its value. While maturity is not widespread, interest is huge, benefits are being seen, and most organisations have at least begun to embark on their AI journey.
Taking the next step
ARTICLE
Organisations must harness AI to extract value from data, but challenges abound. Data pre-processing, from discovery to breaking down silos, to quality control, and managing data from edge to cloud, comes first. Taking the right approach to modelling, from analytics to machine or deep learning, with the right technology and expertise comes next. Intel provides a holistic and open path forward, addressing the full data, modelling and deployment pipeline, with the freedom to compute on whichever architecture is best, including the only x86 CPU with built-in AI acceleration.
Sponsor insight - Intel
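The data-to-deployment flow described above lends itself to a simple illustration. The sketch below assumes scikit-learn and pandas and a hypothetical customers.csv file with a "churned" column; it shows pre-processing and modelling combined in one pipeline so the same transformations apply at training and inference time. It is illustrative only, not a description of Intel's tooling.

```python
# A minimal sketch of the data -> model -> deployment flow described above,
# using scikit-learn purely as an illustrative (assumed) framework choice.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.impute import SimpleImputer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Hypothetical dataset: customer records with numeric and categorical fields.
df = pd.read_csv("customers.csv")                      # assumed file name
X, y = df.drop(columns=["churned"]), df["churned"]     # assumed target column

numeric = X.select_dtypes("number").columns
categorical = X.select_dtypes("object").columns

# Pre-processing: impute gaps, scale numerics, encode categoricals.
prep = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer(strategy="median")),
                      ("scale", StandardScaler())]), numeric),
    ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
])

# Modelling: one pipeline keeps pre-processing and the model together,
# so the same transformations are applied at training and inference time.
model = Pipeline([("prep", prep), ("clf", GradientBoostingClassifier())])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model.fit(X_train, y_train)
print("hold-out accuracy:", model.score(X_test, y_test))
```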
The benefits of AI in enterprise environments are widely recognised. From intelligent reporting to proactive cybersecurity, and from customer insights to automating human-intensive processes, the drivers of success enabled by AI are well known. However, how can organisations cut through the hyperbole to uncover the cases for genuine business success? Computing's latest report into this area, conducted in collaboration with Intel®, uncovers how organisations have gone beyond the proof of concept and facilitated success using AI.

For example, AI in healthcare is positively impacting patient care. Routine X-rays, supplemented with AI, speed up diagnoses, improve scanner productivity, and improve the patient experience. The speed and accuracy of imaging are significantly improved through intelligent tooling trained on a wealth of historical data. In this way, AI does not replace healthcare employees but instead enhances the capabilities of users and automates mundane manual tasks. By taking on repetitive tasks that usually fall to specialist workers and invite inattention, AI allows organisations to redirect their resources while preventing user error and fatigue. For example, rotation, a laborious but essential component of reviewing medical scans, is automated using AI. Reports estimate that removing the human element from rotation saves employees around three working days per year.
Intelligent insights
AI, capable of intelligently tracking user behaviour and decisions, is also benefitting customer experience. Data obtained on user interactions can be used to create straightforward, actionable insights for businesses. Based on this, AI can make automatic recommendations to customers that are specific to their previous and real-time preferences. Alongside contextual data such as the time of year or even the weather outside, organisations like restaurants can provide tailored services that, for example, recommend soup on a cold day to a soup-loving customer, or suggest meal packages based on what is already in their basket. Performance enhancements as a result of AI also improve user interactions, enabling quick, reliable, and user-friendly experiences. For customers, a positive interaction is hugely important in attracting and retaining their business.

While progress in AI adoption is uneven, the advancements are clear. Roughly one in two organisations have begun their AI transformation journey, either trialling, implementing, or having fully implemented the technology. Those organisations underline the significant role AI has played in enabling new lines of business and revenue streams, with around 80 per cent of respondents agreeing it has positively impacted their competitive edge.
Across industries, organisations are looking to use AI to enhance decision-making, automate processes, and gain crucial insights. However, adoption still has a long way to go, with many only just beginning to embark on their journey to establishing a culture led by advanced data analytics. Computing's latest report in this space, carried out in collaboration with Intel®, finds that fewer than one in ten organisations are mature enough to operationalise and scale AI. Roughly 80 per cent of respondents agree their organisation is in the early stages of adoption, suggesting they are encountering problems with moving from proof of concept to revenue-generating success.

When asked how their organisation's technology strategy will change in the next few years, Computing respondents overwhelmingly highlight AI. Plans to increase its use by automating more processes, saving customer and employee time, are seen as a key advantage of implementation. They see AI as an opportunity but recognise the importance of identifying current processes and how they can be improved using the technology. Moving beyond proof of concept depends on a strategic rollout comprising all elements of an organisation across data, technology, people, and governance. Careful planning, punctuated with relevant, measurable performance metrics, is the key to success with the least cost and risk to an organisation.
Data and technology
Examining what data is required to support specific AI initiatives, as well as how existing or needed infrastructure and tools can train, deliver, and manage AI, should be top of mind. AI implementations are only as effective as the data they are fed. Most organisations utilise some form of data analytics to summarise and visualise historical data. Some will have reporting tools that enable detailed queries alongside trend and causation identification. Organisations looking to achieve their AI strategy ambitions must examine their current data set capabilities, integration, and hygiene to see how they can be used to train AI models and generate business benefits.
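As a hedged illustration of the kind of data-hygiene review described above, the short sketch below uses pandas to report completeness, duplication, and freshness for a hypothetical transactions.csv file; the column names and the 40 per cent sparsity threshold are assumptions for the example only.

```python
# A simple data-hygiene audit: completeness, duplication and freshness checks
# of the kind described above. File, columns and thresholds are illustrative.
import pandas as pd

df = pd.read_csv("transactions.csv", parse_dates=["timestamp"])  # assumed file

report = {
    "rows": len(df),
    "duplicate_rows": int(df.duplicated().sum()),
    "missing_by_column": df.isna().mean().round(3).to_dict(),
    "newest_record": df["timestamp"].max(),
}

# Flag columns that are too sparse to be useful training features as-is.
sparse = [col for col, frac in report["missing_by_column"].items() if frac > 0.4]

print(report)
print("columns needing remediation before model training:", sparse)
```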
People and governance
In addition, having the right leadership practices and roles to establish and nurture a culture driven by AI is important. Policies and processes should be informed by how specific organisations operate. Skills shortages are pervasive in technology roles, meaning organisations must review their in-house capacity to achieve business ambitions. Does your organisation have skills gaps? How can you fill those gaps? Do you have organisational buy-in for implementing AI? How can you increase awareness of AI initiatives? These are important questions that must be answered before beginning an AI journey. For many, the answers will lie in adopting technology that democratises data and AI by offering a user-friendly UX and requiring little development or data analytics expertise.
AI continues to garner interest across industries, demonstrating clear benefits to cost savings, decision-making speed, and customer insights. However, fully realising those benefits in a measurable way can prove difficult. Unsurprisingly, a return on investment (ROI) is top of mind for organisations and is vital to securing budget for further AI initiatives. Computing's latest research on real-world AI use, conducted in collaboration with Intel®, delves into how IT leaders are responding to this challenge. Organisations that are implementing AI overwhelmingly agree it has already had a positive impact. Of those that have fully deployed an AI solution, 88 per cent report it has had a significant influence on their organisation and 100 per cent say they have seen a return on their investment. When asked where the most benefit is being seen from their data analytics and AI initiatives, IT leaders reported gains from improved risk management, enhanced customer insights, and a greater ability to meet customer needs.
Customer experience and intelligence gains
Intelligent reporting that builds from historical data enables new marketing approaches for organisations. Customer insights, for example preferences and purchasing behaviour, aid understanding of habits in a way that can produce actionable suggestions, driving long-term returns. For example, retail banks making use of AI in Know Your Customer (KYC) processes are experiencing huge gains in efficiency as well as improved regulatory compliance. When dealing with documentation and information from account holders and applicants, data can become fragmented and inconsistent. With intelligent tooling, clients are more satisfied, privacy and confidentiality are reliably secured, and the cost involved in retaining business is greatly reduced without human intervention. Digitising KYC reduces the need for customers to physically visit a branch, an attractive approach in an increasingly remote world. Accelerating KYC with AI means banks can lower the cost of compliance checks and save money on internal processes, altogether quickly delivering a tangible return on their investment.

Of those that have fully implemented AI at their organisation, 100 per cent have seen an ROI. The most common use cases are within data analytics, deep learning, and natural language processing. When asked to rate the current success of all AI use cases for their organisation, the most common are rated the highest, averaging an approximate score of 8 out of 10. This demonstrates the reliability and performance of applying AI initiatives. Budget constraints are a pervasive issue for organisations today, not least as we enter a period of economic challenge, but real-world AI use cases are providing cost savings. Legacy technologies are a growing expense, so investing in developing and managing flexible infrastructure capable of unlocking data value will significantly improve ROI.

Cutting through the hype surrounding AI and establishing genuine business value from its use can be difficult. Success depends on strategy, culture, and using the technology in the places it best suits. The gains to be made from successful AI implementation all have downstream effects on cost. Intelligent reporting facilitates actionable insights, automating mundane tasks redistributes vital resources, and accurate, reliable processes enable confident decision-making that drives ROI.
A successful, cost-effective AI journey is built on high-performance hardware. Analysing vast amounts of data quickly and accurately, a key mechanism in AI, relies on specialist technology with capable compute features: highly threaded workloads demand numerous cores, high-bandwidth memory, and AI-specific instruction sets, all while keeping a lid on power demands. Computing's recent research into this topic, conducted in collaboration with Intel®, finds hardware specification is of high value to IT leaders across industries. Around 60 per cent of survey respondents agree hardware is vital to effective AI and analytics workflows, with a third strongly agreeing. IT decision-makers recognise that efficiency and advanced analytics capabilities depend on memory management and parallelisable compute power. The right infrastructure is therefore paramount in enabling access to data in a way that allows queries and actions to be carried out. Whether workloads are run on-premises or in the cloud, memory, compute, and networking infrastructure must be architected to complement the design and deployment of AI solutions. Hardware directly impacts speed of performance and throughput, meaning optimised infrastructure will ensure AI implementations don't act as a bottleneck on business operations, and instead accelerate them.
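For readers who want to see what "AI-specific instruction sets" look like in practice, the rough sketch below reads /proc/cpuinfo on Linux and reports the logical core count alongside a handful of AI-oriented flags such as AVX-512 VNNI and AMX. It is a capability check only; the flag list is not exhaustive and the approach will not work outside Linux.

```python
# A quick capability check on Linux: core count and AI-oriented instruction
# set flags (e.g. AVX-512 VNNI, AMX). A sketch only; flag names vary by CPU
# generation, and reading /proc/cpuinfo only works on Linux hosts.
import os

AI_FLAGS = {"avx512f", "avx512_vnni", "amx_tile", "amx_int8", "amx_bf16"}

flags = set()
with open("/proc/cpuinfo") as f:
    for line in f:
        if line.startswith("flags"):
            flags.update(line.split(":", 1)[1].split())
            break

print("logical cores:", os.cpu_count())
print("AI-relevant instruction sets present:",
      sorted(AI_FLAGS & flags) or "none detected")
```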
Lean and green
The development of computing architecture that runs efficiently, with low power consumption, is of increasing concern. Data centre energy and water use is under increasing scrutiny as organisations grapple with mounting costs and greater regulatory pressure in the face of environmental concerns. It is ever more important to review carbon footprints and invest in sustainable infrastructure. Inefficient processors or supporting hardware platforms will not only create problems for organisational productivity but further prevent innovation in processes such as AI. Companies should look to invest in hardware that has been designed and optimised for AI workloads. To reap its full value, they must determine where AI technologies can be used in their organisation specifically, based on business outcomes. Leveraging the right technology and building digital initiatives on dependable infrastructure are key drivers of success. AI is performance oriented, and any system is only as fast as its slowest component. Without capable hardware in place, you're falling at the first hurdle.
AI advancements are helping organisations across every industry to make more intelligent business decisions, faster and with greater confidence. However, even intelligent solutions will not succeed if employees are not motivated to use them. Amidst the long-standing problems surrounding talent acquisition and retention, organisations must strive to scale with the talent they have and build a culture driven by advanced data analytics. Computing's latest research report into real-world AI adoption, conducted in collaboration with Intel®, explores AI's impact on today's working environments. IT leader survey respondents emphasised the performance enhancements gained from implementing AI and how it enables quick, reliable, user-friendly interactions for both their customers and employees. However, AI maturity is lacking across many industries, with just 7 per cent of those surveyed reporting they have fully implemented AI at their organisation. Roughly a third are in the trialling stage, suggesting adoption is at an early stage or has been delayed. Yet 96 per cent are at least interested in employing AI, indicating there is headway to be made in this space.
AI accessibility is dependent on sharing knowledge and developing literacy from within, whilst drawing on the right third-party partners. Sourcing and supporting the right expertise will ensure the best vision, roadmap, and decisions can be made, and then see those objectives borne out on the shop floor. In most enterprise environments, AI-enhanced capabilities are not accessible to the average person. Often, its use is confined to specialists developing, deploying, and training models. However, employees should be aware of how AI can aid their role or tasks and be armed with the tools to employ it day-to-day. This is part of an increasing drive towards data democratisation in the enterprise space. IT leaders must ask: what new talent is needed, and on what basis? What partners and outside help should be used to facilitate this? How can we upskill and reskill our workforce?
AI-augmented
People may be wary of AI, especially how its use may impact their role. Users will not necessarily be replaced by the technology, but instead better supported, and dedicated to issues requiring their specialist knowledge and creativity. AI should complement existing ways of working. In this way, users can be more focussed on their objectives rather than wrangling with the technology they’re using or becoming fatigued from monotonous, mundane tasks. IT leaders are starting to strive for a more data-driven decision-making culture – one that is increasingly underpinned by AI. A successful AI strategy is built on skills and talent.
It is difficult to find an organisation that does not store at least some of its assets in the cloud. However, that does not mean that on-premises computing is a thing of the past. In fact, it still has an important role to play, and as such must be kept up-to-date. According to research by Computing in which 131 IT leaders were surveyed, only 2 per cent of organisations have fully migrated to the cloud, with the rest taking a hybrid approach, combining cloud with on-premises. In fact, 36 per cent still mainly store their workloads on-premises and 66 per cent agree that on-premises computing still has a crucial role to play at their organisation. While cloud computing offers many benefits, such as scalability and flexibility, and is particularly suited to smaller organisations without the resources to manage their data storage in-house, that does not mean it is the best option for all workloads.
The pros of on-prem
Server reliability, availability, serviceability, and low latency are essential for mission-critical applications such as ERP, HCM, and databases. Moving these to the cloud may risk costly downtime. Some legacy applications have also been designed in a way that doesn't allow data to be migrated to the cloud easily, or they depend on important customisations, making a wholesale migration unsuitable due to the cost and complexity involved. Furthermore, some regulations may require data to remain on-premises, where organisations have greater control over who can access it. While cloud computing has had the spotlight for some time now and organisations aspire to be "cloud first" or "cloud only", it is clear that some workloads are better suited to on-premises and, for some organisations, on-premises remains the safest and most cost-effective choice. It is therefore important that it is not overlooked in digital transformation strategies.
Updating infrastructure
However, just as cloud migration alone is not enough to achieve business transformation, keeping workloads on-premises without regularly assessing whether they are performing at their best may mean organisations are missing out. Sixty-two per cent of respondents say their organisation has modernised its on-premises hardware to keep pace with innovation. The fact that over a third have not done so is concerning, and implies they are suffering poorer performance, reliability, energy efficiency and security. Without embracing up-to-date technology, organisations may get left behind or outcompeted.

With on-premises computing still having an important role to play, it is important that ageing datacentre hardware does not stand in the way of organisations operating flexibly and with agility. Seventy-five per cent agree high-performance, reliable hardware is crucial to on-prem success, with just 5 per cent disagreeing. It is vital that processors and supporting hardware platforms are efficient across all workloads. Without strong infrastructure in place, the benefits of keeping certain workloads on-premises will not be fully realised.

Beneath every computing workload, from edge to cloud, is hardware. Whether or not that environment fulfils its business objectives depends on performant, secure, and cost-effective hardware. How a data centre is structured and managed, how equipment is deployed, data hygiene, and how often hardware is refreshed all contribute to this and should be regularly assessed to see if improvements can be made. Organisations must regularly audit their on-premises infrastructure to determine whether they are getting the most from their data, whether they are benefitting from technological advances, and whether any updates are needed. By having a modernisation strategy in place for their IT infrastructure, organisations will maximise their investments, ensuring they can implement new ways of working without being slowed down by legacy hardware, and get the most out of both cloud and on-premises.
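A regular infrastructure audit of the kind described above can start very simply. The sketch below, which assumes the psutil library and uses illustrative thresholds, gathers basic capacity figures that could feed into a modernisation review; it is a starting point, not a complete audit.

```python
# A minimal sketch of a periodic on-premises capacity check using psutil
# (an assumed library choice); thresholds are illustrative, not recommendations.
import psutil

checks = {
    "logical_cpus": psutil.cpu_count(),
    "cpu_utilisation_pct": psutil.cpu_percent(interval=1),
    "memory_used_pct": psutil.virtual_memory().percent,
    "root_disk_used_pct": psutil.disk_usage("/").percent,
}

# Flag any utilisation figure above an arbitrary 85 per cent warning level.
warnings = [name for name, value in checks.items()
            if name.endswith("_pct") and value > 85]

print(checks)
print("capacity warnings:", warnings or "none")
```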
To learn more about why on-prem computing is still important for today's organisations, read the full report.
Learn more about Computing’s latest research into the endpoint manageability state of the nation
Employees now rarely set foot on-premises and decide for themselves when they are at their desks. Remotely managing endpoint estates regardless of power state, OS state, or location is therefore essential to the modern workforce. Organisations must onboard new workers, update existing endpoints, and patch problems anywhere, anytime. Computing's latest research on this subject, conducted in partnership with Intel, highlights the chief endpoint manageability concerns for IT leaders today. The survey took in 150 IT decision-makers involved in endpoint strategy or implementation at their organisation, from a range of industries. 100 per cent of survey respondents saw endpoint estate management demands increase or stay the same over the last two years. However, fewer than a quarter say they had an advanced, reliable remote management technology in place pre-pandemic, and under half say they have improved their processes since.
Distance difficulties
Computing survey respondents voiced the challenges. New ways of working have meant a proliferation of devices, and IT teams have had to keep up with the demands; for many, issues have occurred. A selection of responses reflecting the most common sentiments follows:
“All our staff are now working from remote endpoint devices which must be managed within a mobile device platform.”
“People working from home are no longer using the corporate internet gateway. Workers are not regularly connecting to the network, creating difficulties with updating patches and new configurations.”
“Bandwidth restrictions and usage of remote devices has been an issue.”
“The increased usage of multiple devices from home caused performance issues on endpoint infrastructure.”
“Endpoints being out of date and working in remote areas are not able to connect back in and receive updates. Visibility is a real problem.”
Skills shortages
Advances in digital capabilities have supported the seismic shift to remote working, but digital acceleration of this kind has placed greater responsibility on IT teams, creating workload and skills problems.
“We had to accelerate our plans and rapidly roll out additional devices. Time was a challenge and demand for better endpoint security greatly increased.”
“We have a lack of resources to support the increase in remote devices. There’s insufficient staff, knowledge, and skills internally.”
“There are issues with the availability of IT staff to monitor status of endpoints and their capacity to respond to alerts.”
“Physical control of assets, off network management, and overall visibility are major challenges.”
People problems
As users become increasingly dispersed, it becomes more difficult for IT teams to provide support. Making sure users remain vigilant and aware is also tricky.
“BYOD is difficult. There are human errors and clashes between BYOD support and endpoint security management. User awareness, education, and errors are a challenge.”
“It’s difficult to get end users to accept policies, procedures, and tools now they are remote.”
Across the board, endpoint manageability struggles are taking their toll. Only 7 per cent of respondents are extremely confident in their current endpoint processes. As devices become increasingly dispersed, they must be secure, stable, and efficient. In order to manage estates as efficiently and effectively as possible, IT leaders should review and refresh management solutions, modernising their approach and keeping pace with the demands of today and tomorrow.
IT leaders reveal endpoint manageability struggles
High performance computing (HPC) substantially improves computing performance and speed. It is no surprise organisations are increasingly opting for this technology, given its ability to overcome the limitations of traditional PCs and processors. HPC systems can operate over one million times faster than typical desktop, laptop, or server setups. These superior speeds are further improved by the use of HPC accelerators, allowing organisations to rapidly process data at lower costs and with much lower latency. Computing's latest research in this space, conducted in collaboration with Intel, finds only a fifth of organisations utilising HPC are also using accelerators, suggesting there is serious headway to be made in this area. IT leaders using accelerators overwhelmingly recognise their benefits, with 67 per cent of organisations agreeing they have improved the efficiency of workloads, while over 90 per cent report accelerators have enabled or will enable new HPC use cases for their enterprise. Delivering optimal performance depends on selecting the right processors and accelerators, as well as ensuring the right hardware underpins them.
Taking advantage of built-in accelerators
In order to accommodate the massive processing demands of HPC workloads, sizable memory bandwidth is required. Traditionally, this meant owning or leasing supercomputers to carry out HPC tasks. Thankfully, accelerators are today much more accessible and can be built directly into the Central Processing Unit (CPU). Initial success with HPC is reflected in IT leaders' motivation to continue or increase their utilisation: 97 per cent of respondents predict their use of HPC will increase or stay the same in the next two years. This is important for enterprises needing consistent and reliable results. HPC use cases range from data analysis and AI inferencing to simulating and modelling test scenarios. Consequently, HPC is often at the forefront of industry developments. When asked how accelerator usage has affected their organisation, respondents reported significant improvements to their response times, monitoring abilities, and performance. Adopting HPC with built-in accelerators affords organisations the critical performance they need, enabling immediate and reliable computing power without the need to purchase new hardware or lease infrastructure. Organisations can save additional costs with this approach and concentrate on maximising their investments.
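To make the point about built-in acceleration concrete, the rough sketch below times a single-precision matrix multiply with NumPy, which dispatches to an optimised BLAS and so exercises whatever vector or matrix instructions the host CPU exposes. The matrix size and the derived GFLOP/s figure are illustrative only.

```python
# A rough illustration of how HPC-style workloads exercise whatever
# acceleration the CPU's maths libraries expose: NumPy hands this matrix
# multiply to an optimised BLAS, which in turn uses the vector/matrix
# instructions available on the host. Sizes and timings are illustrative.
import time
import numpy as np

n = 4096
a = np.random.rand(n, n).astype(np.float32)
b = np.random.rand(n, n).astype(np.float32)

start = time.perf_counter()
c = a @ b
elapsed = time.perf_counter() - start

flops = 2 * n ** 3  # multiply-adds in an n x n matrix multiplication
print(f"{n}x{n} matmul in {elapsed:.2f}s ~ {flops / elapsed / 1e9:.1f} GFLOP/s")
```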
Actioning AI
Computing power is paramount in applications like AI. For demanding calculations like this, dedicated accelerators enhance performance and scalability. Designed to maximise throughput for training and machine learning while ensuring optimum efficiency, AI accelerators scale across systems and workloads with high accuracy. They achieve outstanding results in a much shorter time frame, optimising the execution of tasks as well as costs and organisational footprint. Accelerators underpinning AI greatly expand the capabilities of organisations focused on areas such as data science, enhancing the speed of results and generated insights for very large quantities of data and complex sets. Deep learning benefits considerably from HPC accelerators, particularly for large, multidimensional data sets. HPC, strengthened by the use of accelerators and the application of AI, is making significant contributions to analytics, design, scientific visualisation and simulation, and much more.

Accelerating possibilities and delivering on performance, ROI, and energy saving in the long term are all top of mind for today's organisations. Workloads have changed, meaning investing in and employing the right hardware is vital to ensuring the benefits of HPC are fully realised. Forty-four per cent of organisations surveyed say HPC is having a transformative impact on their company, demonstrating the value of adoption. In a data-centric world, having a sound strategy that makes use of HPC and accelerators is critical to powering innovation and tackling today's challenges.
To learn more about Computing's research into accelerating your HPC strategy, read the full report.
While cloud computing is undoubtedly here to stay, the term "declouding" is increasingly cropping up in technology circles. Organisations may have rushed to the cloud as a means of transformation or survival, particularly during the Covid-19 pandemic and the shift to remote access. While some workloads are well suited to cloud computing, others may function better elsewhere. As a result, a growing number are considering "declouding". Also known as "unclouding", declouding involves organisations moving some of their workloads from the public cloud to on-premises, hybrid cloud or private cloud. Organisations may have different reasons for choosing to decloud, including performance, cost optimisation, data regulation, and security issues arising from poor configuration. A greater need for control or customisation of workloads not offered by the public cloud may also be a driving factor. Some may therefore choose to move back on-premises workloads that require low latency, involve legacy applications, or for which security is paramount.
Returning from the cloud
While the majority of organisations surveyed have not yet moved workloads away from the cloud, Computing's research revealed that 13 per cent have, with 9 per cent having plans to do so in the future. The number considering declouding is likely to increase in the coming years, given widespread concerns around costs, skills, and integration. Databases, web-facing applications and HPC/GPU workloads are the most common workloads organisations are moving back on-premises. These are generally more complex workloads, which may be unsuitable for a cloud environment and work better locally, something organisations may have discovered through experience.

The most common reason for declouding is cost, chosen by 65 per cent of respondents, with 24 per cent choosing greater control or customisation, and integration. "Other" responses included internet latency and performance. Security, difficulties migrating legacy or bespoke applications, vendor lock-in, and a lack of cloud talent within the organisation were each declouding motivations for only 6 per cent. While the benefits of cloud technology may allow for cost savings over time, organisations are clearly concerned about the financial aspects of cloud migration, and if they have not yet achieved a return on investment may revert to on-premises. While cloud computing means there are no large upfront capital costs for hardware, ongoing subscription costs, combined with high operational expenditure and the expenditure associated with moving workloads in and out of the cloud, may mean costs soar. If organisations are not noticing a marked improvement as a result of cloud migration, this may be hard to justify.
Cloud considerations
When asked whether the benefits of cloud computing had been overstated, 38 per cent of respondents agreed with the statement, with 21 per cent disagreeing, and 40 per cent neither agreeing nor disagreeing. Rather than following the crowd and migrating workloads to the cloud en masse, it is important for organisations to carefully consider which workloads are best suited to the public cloud, and which are not, in order to avoid declouding further down the line. A hybrid approach to on-premises and the cloud is a good way to achieve this, allowing organisations to get the most from the benefits offered by the cloud while also lowering costs, having greater control over data and security, and avoiding vendor lock-in. Having a well thought-out cloud governance strategy also helps organisations keep track of what they are storing in the cloud, and whether workloads are in the right environment to perform at their best. From there, they can assess if declouding certain workloads is an appropriate next step.