Signals
Q1 2024
Emerging tech trends
https://www.inc.com/jason-aten/why-microsofts-ceo-satya-nadella-says-its-co-pilot-ai-assistant-will-be-as-significant-as-pc.html
https://computerhistory.org/blog/audrey-alexa-hal-and-more/
https://www.theguardian.com/technology/2023/jul/25/joseph-weizenbaum-inventor-eliza-chatbot-turned-against-artificial-intelligence-ai
https://www.seattlemet.com/news-and-city-life/2022/08/origin-story-of-clippy-the-microsoft-office-assistant
https://www.globenewswire.com/news-release/2023/08/16/2726758/0/en/Intelligent-Virtual-Assistant-Market-Size-Share-Analysis-Growth-Trends-Forecasts-2023-2028.html
https://www.bcg.com/publications/2023/how-generative-ai-transforms-customer-service
https://techcrunch.com/2023/11/09/humanes-ai-pin/
https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/the-economic-potential-of-generative-ai-the-next-productivity-frontier#key-insights
The Impact of AI on Developer Productivity: Evidence from GitHub Copilot
https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/unleashing-developer-productivity-with-generative-ai
https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/unleashing-developer-productivity-with-generative-ai
https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/unleashing-developer-productivity-with-generative-ai
https://www.finnegan.com/en/insights/articles/insights-from-the-pending-copilot-class-action-lawsuit.html
https://www.cbinsights.com/research/generative-ai-copilots-coders/
https://github.blog/2023-06-13-survey-reveals-ais-impact-on-the-developer-experience/
https://media.defense.gov/2023/Sep/12/2003298925/-1/-1/0/CSI-DEEPFAKE-THREATS.PDF
The State of Identity Verification 2023 Report
The State of Identity Verification 2023 Report
The State of Identity Verification 2023 Report
https://www.bloomberg.com/news/newsletters/2023-05-17/deepfake-detection-tech-becomes-a-focus-for-venture-capital
https://www.biometricupdate.com/202309/us-security-agencies-tout-biometrics-liveness-detection-to-defend-against-deepfakes
https://spectrum.ieee.org/deepfake
https://www.darpa.mil/program/semantic-forensics
https://spectrum.ieee.org/deepfake
https://www.zdnet.com/article/googles-new-tool-can-detect-ai-generated-images-but-its-not-that-simple/
https://www.theverge.com/2023/11/8/23951955/microsoft-elections-generative-ai-content-watermarks
https://forums.appleinsider.com/discussion/233810/tim-cook-calls-spatial-computing-in-apple-vision-pro-an-aha-moment-in-a-users-life
https://www.lvmh.com/news-documents/news/lvmh-and-epic-games-announce-strategic-partnership-to-transform-maisons-creative-pipeline-and-customer-experiences/
https://www.pymnts.com/metaverse/2023/bank-of-america-uses-virtual-reality-for-training-over-200000-staffers/
https://www.itransition.com/virtual-reality/retail
https://www.ualberta.ca/rehabilitation/research/rehabilitation-robotics/current-projects/projectdr.html
https://www.pcmag.com/news/spacetop-laptop-and-its-100-inch-ar-display-can-be-yours-for-2150
https://www.activate.com/insights-archive/Activate-Technology-and-Media-Outlook-2024.pdf
https://market.us/report/spatial-computing-market/
https://blog.google/technology/research/project-starline/
https://csa-iot.org/csa-iot_products/
https://www.wsj.com/articles/the-ai-boom-is-here-the-cloud-may-not-be-ready-1a51724d
https://www.axians.com/use-case/how-rotterdam-becomes-the-smartest-port-in-the-world/
https://www.aidot.com/blog/post/get-you-to-know-the-matter-protocol
https://www.mastercard.com/news/europe/en/newsroom/press-releases/en/2023/mercedes-benz-and-mastercard-introduce-native-in-car-payments/
https://www.thalesgroup.com/en/worldwide-digital-identity-and-security/iot/magazine/singapore-worlds-smartest-city
https://www.novatr.com/blog/singapore-world-smartest-city
https://www.statista.com/statistics/1190391/m2m-connection-growth-worldwide/
https://www.macrotrends.net/stocks/charts/NVDA/nvidia/market-cap
https://datacentremagazine.com/articles/efficiency-to-loom-large-for-data-centre-industry-in-2023
https://www.mckinsey.com/industries/industrials-and-electronics/our-insights/semiconductor-fabs-construction-challenges-in-the-united-states
https://www.cloudzero.com/blog/cloud-computing-market-size/
https://www.cnn.com/2023/10/18/tech/us-china-chip-export-curbs-intl-hnk/index.html
https://www.nytimes.com/2022/08/31/technology/gpu-chips-china-russia.html
https://bbnc.bens.org/semiconductors---page-3-key-inputs
https://bbnc.bens.org/semiconductors---page-3-key-inputs
https://www.the-waves.org/2023/01/21/rise-of-tsmc-why-and-how-to-replicate/
https://www.whitehouse.gov/briefing-room/statements-releases/2023/08/09/fact-sheet-one-year-after-the-chips-and-science-act-biden-harris-administration-marks-historic-progress-in-bringing-semiconductor-supply-chains-home-supporting-innovation-and-protecting-national-s/
https://blog.cultos.io/4-brands-that-are-successfully-tokenizing-loyalty-in-the-web3-era-d4c4479a0b15
https://www.ledgerinsights.com/lufthansa-rewards-app-nfts-uptrip/
https://www.innovation.pitt.edu/2022/10/token-identity/
https://www.fortunebusinessinsights.com/big-data-analytics-market-106179
https://www.newvantage.com/_files/ugd/e5361a_247885043758499ba090f7a5f510cf7c.pdf
https://www.prnewswire.com/news-releases/cloud-ai-market-size-to-grow-usd-887-billion-by-2032-at-a-cagr-of-35-8--valuates-reports-301969170.html
https://preyproject.com/blog/the-key-pillars-of-zero-trust-mastering-the-basics
https://www.secureworld.io/industry-news/benefits-zero-trust-architecture
https://nvidianews.nvidia.com/news/aws-nvidia-strategic-collaboration-for-generative-ai
https://www.pymnts.com/artificial-intelligence-2/2023/aws-and-nvidia-expand-collaboration-to-deliver-generative-ai-infrastructure/
https://finance.yahoo.com/news/artificial-intelligence-biotechnology-market-share-121500910.html
https://www.pharmaceutical-technology.com/analyst-comment/generative-ai-revolutionise-drug-discovery/?cf-view
https://www.cio.com/article/1257351/generative-ai-is-pushing-unstructured-data-to-center-stage.html
https://www.linkedin.com/posts/asthanamanish_oracles-vision-for-the-futurelarry-ellison-activity-7110380588968148994-T2GG
https://ai.plainenglish.io/understanding-pix2pix-gan-e21c2bedd213
AI
Compute
Data
Convergence
In today’s evolving technological landscape, businesses must adapt swiftly to harness new opportunities and remain relevant. This issue of Mastercard Signals explores tech trends poised to reshape commerce over the next three to five years. Advances in three areas — artificial intelligence, computational power and data technology — are converging to propel these trends forward. As they spur innovation, technology will become more intuitive, interactive, immersive and embedded in our daily lives — with significant implications for finance, retail and other sectors.
AI
Section 1 examines how generative AI could become integral to our daily lives. From ubiquitous digital assistants to supercharged software development,
we uncover the complex interplay between user behaviors, interfaces and responsible AI. Additionally, we review the challenges presented by AI-powered deepfakes, highlighting the growing need for effective detection technologies and content authentication standards.
1. Personal copilots
2. Enhancing code
Supercharging software development
3. Race to detect
The battle against deepfakes
Compute
Section 2 provides an overview of innovations in computing, including spatial interfaces that enable new ways to interact with technology beyond conventional screens. We also cover the emergence of next-generation networks that enhance automation and intelligence across devices and examine the increasing demand for computational power to support advanced software and services.
4. Spatial interfaces
5. Connected tech
Faster, smarter networks
6. Power surge
The growing need for capacity
Finally, we examine convergences across these trends. This final section highlights how AI, computation and data analytics are not isolated domains but interconnected components, collectively driving innovation across industries and shaping the future of commerce.
Innovation is exploding across the AI domain, with numerous emerging use cases, potential threats and implications for shopping, travel, gaming, entertainment and other industries. We explore three AI trends likely to have a big impact in 2024 and beyond — more sophisticated digital assistants, supercharged software development and the battle to defang malicious deepfakes.
Personal copilots
Generative AI is expanding the power and reach of digital assistants, propelling them from simple task performers to invaluable personal and professional aides. These advanced apps are evolving to perform jobs ranging from travel booking to nutritional and life coaching to language translation. They are also increasingly equipped to provide more personalized shopping guidance due to their near-human communication skills and their ability to learn and understand specific user preferences.
Tech drivers
Familiarity with AI bots like Siri and Alexa has laid the groundwork for the broad adoption of a new crop of digital assistants. These gen AI-driven copilots understand more complex commands and queries, helping people with tasks that require adaptation to changing circumstances and an ability to "think" laterally, not just according to pre-programmed logic. They can engage in contextually appropriate back-and-forth communication with people, generate tailored answers, perform complex tasks and create more personalized interactions. Four drivers have increased the potential of these personal copilots. The first is the engineering talent behind them. The second is the vast amount of training data used to improve how these AI models respond, making them adept at natural language processing. Third, there are the advanced semiconductor chips used to train models. And fourth, there are the cloud platforms that aggregate the necessary computing capacity.
"This is as significant as the PC was in the ‘80’s, the Web in the ‘90’s, mobile in the 2000s, cloud in the 2010s."
Satya Nadella
| Microsoft CEOi
THE HISTORY OF DIGITAL ASSISTANTS
1952: Audrey, a six-foot rack of electronics at Bell Labs, was the first machine to recognize speech — the digits zero to nine spoken by its inventor.ii
1966: The first chatbot answered questions transmitted from an electric typewriter to a mainframe. It was named Eliza after Eliza Doolittle in the play “Pygmalion."iii
1990s: People carried personal digital assistants that synced with their desktop computers to manage contacts and calendars. Early models included the Tandy Z-PDA, the Apple Newton MessagePad and U.S. Robotics’ PalmPilot.
1996: Microsoft introduced Clippy, an animated paper clip, to assist users of its Office products.iv
2010s: Conversational voice assistants including Siri and Alexa populated billions of pockets and kitchens worldwide.
2023: Microsoft, Google, Meta and Amazon incorporated gen AI into their consumer and office products, improving the capabilities of virtual assistants.
Emerging use cases
Shopping assistance
Travel services
Integration into existing solutions
Plausible futures
Bear case: Near-term digital assistants underwhelm users and see low adoption beyond search. The tech takes longer to mature and develop.
Bull case: Ubiquitous usage by individuals for critical tasks. New services will emerge, ranging widely from therapy bots to money managers.
Wearables that incorporate gen AI — like the next iteration of the Meta Ray-Ban smart glasses, released in late 2023 — are starting to impact the market, and innovators are devising newer interfaces to deliver always-on assistants. An example is the AI Pin manufactured by start-up Humane. Designed to be affixed to the user’s clothing, the pin offers voice access to ChatGPT and includes functions such as instantaneous translation, new types of messaging and the ability to deliver nutritional information on food items scanned with its camera.vii
Integrating assistants
The intelligent virtual assistant market is forecast to grow from $11 billion in 2023 to $46 billion in 2028, a CAGR of 32.7%.v
In a global survey of customer service executives, 95% said they expect AI bots to serve their customers within the next three years.vi
These early use cases are only the beginning. Soon this technology could boost productivity and efficiency at home and at work, becoming more ingrained in our lives. Adoption could accelerate in the next few years, driven by increased user familiarity, improved interfaces and greater contextual relevance. AI providers and businesses using these tools must ensure that they are doing so responsibly and with regard to their privacy implications.
Enhancing code
Generative AI could change software development, enhancing productivity and innovation by automating standard developer tasks. Gen AI applications can assist with legacy code and support new product development — leading to 20% to 45% higher engineering productivity, according to McKinsey.viii While these apps bring efficiency and automation, human oversight remains critical, particularly in monitoring for and correcting biases and other errors that AI might introduce.
GitHub Copilot is a prime example of this technology in action. It creates code snippets based on software developers' natural language descriptions. A study by GitHub found that developers using Copilot could complete a coding task 55% faster than those not using it.ix
Tech drivers
Software copilots can assist software engineers in writing source code, designing software architecture, testing, understanding existing code (essential for code maintenance) and more creative tasks such as UI/UX design. These emerging AI tools understand natural language inputs, so developers can describe the code they need in simple terms and leave it to the AI to write that code in the desired programming language. Non-engineers can take on some light engineering work, making it possible for organizations to redesign their development processes.
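As a sketch of this natural-language-to-code workflow, the snippet below uses the OpenAI Python client as one possible backend. The model name, prompt and "code only" instruction are illustrative assumptions; copilot products embed a loop like this directly in the editor.

```python
# Minimal sketch: turn a natural-language description into code via an LLM.
# Assumes the OpenAI Python client and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

description = "Write a Python function that validates an IBAN checksum."

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any capable code model works here
    messages=[
        {"role": "system", "content": "You are a coding assistant. Reply with code only."},
        {"role": "user", "content": description},
    ],
)

print(response.choices[0].message.content)  # the generated function
```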
Battle for supremacy
Gen AI’s power to facilitate software development could have compelling effects in certain important coding-intensive domains and software services. Systems integrators could find themselves in an AI-powered arms race, each adopting the latest gen AI coding solutions to gain competitive advantage over the others. This dynamic would raise the state of the systems integrator’s art — and be to the advantage of financial firms and other tech-intensive organizations that employ integrators, giving them more choices and leverage in renegotiating outsourcing contracts. Business analysts and consultants could, for their part, see their stock rise with the advent of natural language interfaces and gen AI-supported no-code/low-code tools, letting them better perform certain development tasks for their clients rather than outsourcing them at those clients’ expense.
At the same time, consultants could find themselves under pressure as their clients take advantage of no-code/low-code tools themselves, making outsourcing less necessary.
Challenges to navigate

User trust and acceptance: Building trust is essential given concerns about AI replacing humans in the workforce and knowing too much about individuals.
Privacy and data security: AI assistants may require access to personal data for some use cases. Safeguarding this data and ensuring user privacy is paramount.
Technical limitations: Despite advances, AI assistants are still evolving and may face challenges in understanding context or managing complex tasks.
Intellectual property: Gen AI models train on vast amounts of internet data, some of which is copyrighted. Users of gen AI outputs could unwittingly expose themselves to legal issues. In December 2023, The New York Times became the first major U.S. media corporation to file a gen AI-related copyright infringement suit, against OpenAI and Microsoft.
Data & Tech Responsibility concerns: Ensuring AI decisions are responsible — transparent, fair and unbiased — is crucial. Outputs must be carefully managed to avoid reflecting, or even amplifying, societal and underlying data biases.
Challenges to navigate

Potentially sub-par and unreliable AI-generated code: Ensuring the code generated by AI is reliable, efficient and bug-free requires human oversight. There's also the risk of developing functionally correct code that is inefficient or under-optimized for its specific use case.
Integration with existing systems: Integrating AI tools with existing development environments and workflows can be complex and may require significant adjustments.
Intellectual property dilemmas: IP issues could arise as AI coding assistants possibly use bits of copyrighted code in their training. A class-action lawsuit is pending against developers’ platform GitHub, its owner Microsoft and OpenAI for using platform users’ code and other materials in creating GitHub Copilot.xiii
Language limits: It still needs to be determined whether developers can express complex algorithms precisely enough in natural language for an AI to generate appropriate code.
Avoiding over-reliance: As developers gain confidence in AI-generated code, they may be tempted to "cut and paste" it directly into their codebases, which could have negative results. Gen AI tools require human oversight, and hand-coding will remain necessary.
Plausible futures
Bear case: Slow adoption in large enterprises due to IP risks and integration challenges related to data efficacy.
Bull case: A global shift towards nearly complete AI-powered software development, with AI-first approaches being taught in university programs.
Copilot market map
[Market map: gen AI copilot providers, spanning start-ups and incumbents, across categories including productivity (knowledge management, querying, task automation, assistants, scheduling, presentations), healthcare (administrators and providers), design and editing (audio, image, video), software engineering, marketing, sales and support, human resources, electronics, construction, finance and accounting, and governance.]
Ready for take-off
[Chart: Funding for gen AI software copilots, 2020–2023.xiv]
As gen AI matures, its impact on software development is poised to grow substantially.
This technology promises time and cost savings and opens new avenues for innovation. The ability of AI copilots to learn from vast repositories of existing code means they could continually evolve and become more powerful and versatile. Consequently, the software development process could soon become more efficient and creative.
A recent study conducted by GitHub and Wakefield Research sheds light on the impact of AI on the developer experience.
The survey of 500 U.S.-based developers from companies with 1,000-plus employees revealed that 92% already use AI-powered coding tools, signaling a significant shift in the field.xv
Race to detect
AI tools that generate highly realistic media — audio, video, images and text — are increasingly used to create "deepfakes" for fraudulent activities, scams and disinformation campaigns. This has ignited a battle between cybercriminals on one side and companies and governments on the other. The rise of malicious deepfakes has increased the market for detection tools and spurred new regulations, emphasizing the importance of data provenance and AI's role in bolstering cybersecurity responses.
Deepfakes can be dangerous, and caution is warranted. Yet concerns shouldn't overshadow the fact that gen AI-powered representations can also have productive uses in entertainment, education, training and other areas where there is a continual need for high-quality textual, visual and other content.
“The most substantial threats from the abuse of synthetic media include techniques that threaten an organization’s brand, impersonate leaders and financial officers, and use fraudulent communications to enable access to an organization’s networks, communications, and sensitive information.”
The U.S. NSA, FBI and Cybersecurity and Infrastructure Security Agency xvi
Tech drivers
Fraudulent and misleading content items have always existed, but now there are more tools for making them and more ability to make them at scale. Deepfakes are typically created using a type of machine learning architecture known as a generative adversarial network (GAN). A GAN involves two competing neural networks: a generator and a discriminator. The generator creates the content, while the discriminator evaluates its authenticity. In essence, the generator attempts to fool the discriminator. Continuous feedback and refinement make the generator increasingly adept at producing realistic and convincing media — and the discriminator increasingly adept at detecting whether the media is real or fake. This constant feedback loop enhances the realism of the generated media, presenting both challenges and opportunities.
How a GAN works: generator vs. discriminator lxviii
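To make the adversarial loop concrete, here is a minimal training-loop sketch in PyTorch. The tiny fully connected networks and random vectors are illustrative stand-ins for the image models and media data real deepfake generators use; the network sizes, learning rates and step count are assumptions.

```python
# Minimal GAN training loop: generator vs. discriminator, as described above.
import torch
import torch.nn as nn

latent_dim, data_dim = 16, 32

generator = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, data_dim))
discriminator = nn.Sequential(nn.Linear(data_dim, 64), nn.ReLU(), nn.Linear(64, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCELoss()

for step in range(1000):
    real = torch.randn(128, data_dim)   # placeholder for real media samples
    noise = torch.randn(128, latent_dim)
    fake = generator(noise)

    # Discriminator: learn to label real media 1 and generated media 0.
    d_loss = bce(discriminator(real), torch.ones(128, 1)) + \
             bce(discriminator(fake.detach()), torch.zeros(128, 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator: learn to make the discriminator label its output as real.
    g_loss = bce(discriminator(fake), torch.ones(128, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
```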
46%
46% of businesses have been targeted by identity fraud using deepfakes.xvii
37%
37% of them were targeted by deepfake voice fraud,xviii and 29% by deepfake videos.xix
42%
The deepfake detection market is expected to grow 42% annually through 2026.xx
Challenges to navigate

Accuracy issues: Achieving high accuracy in detection is crucial to avoid the consequences of false positives or negatives.
Lack of public awareness: Increasing public education about deepfakes is vital to mitigate risks.
Advancement of deepfake tech: Keeping up with the development of deepfake technology is a constant challenge for makers of detection tools.
Lack of industry standards: Given the range of threat sources, global coordination is required to develop content authentication approaches.
Plausible futures
Bear case: Widespread deepfakes disrupt various sectors, from news to e-commerce. In response, organizations slow their use of gen AI while the regulatory response stifles innovation.
Bull case: Industry and government successfully establish clear guidelines and standards to manage deepfake content. Innovators develop foolproof tools to identify fraudulent material.
New detection tools
Liveness detection: U.S. security agencies advocate liveness-based identity verification technologies to counter deepfake attacks in video and voice authentication systems.xxi
Blood-flow analysis: Intel’s FakeCatcher software detects deepfake videos by analyzing color patterns on the face that indicate blood flow, a feature typically absent in AI-generated content.xxii
Semantic analysis: DARPA's Semantic Forensics project aims to develop algorithms that can analyze media content for inconsistencies in language and logic, indicating potential manipulation.xxiii
Movement analysis: Researchers at the University of California, Santa Barbara, have developed PhaseForensics, a tool that differentiates deepfake videos by analyzing facial motion.xxiv
Content origin verification: Tools like Google's SynthID and Microsoft's content-credentials-as-a-service can confirm the origin of AI-generated content and ensure its authenticity.xxv xxvi
As the battle against deepfakes intensifies, developing detection technologies and strategies becomes increasingly crucial. The rapid evolution of these technologies highlights the urgent need for more comprehensive solutions to ensure digital media integrity and combat the spread of misinformation. The concerted efforts of industry, government and academic institutions to create robust detection and prevention strategies indicate a way forward in this escalating cyber arms race.
Advances in chip architecture, cloud computing, connectivity, mixed reality devices and processing at the edge have the potential to transform financial services, consumer experiences, healthcare and more. In this section, we examine the impact of spatial computing, smarter networks and increased computational power.
Spatial interfaces
Spatial computing could transform human-machine interaction, enabling people to use hand gestures and eye movements to engage with virtual objects, images and applications displayed across their fields of vision through headsets and smart glasses. This can make possible immersive mixed-reality experiences that blend the digital and physical worlds and deliver an unprecedented level of authenticity — as if the user were inhabiting a movie. As spatial computing technology matures over the next several years, it could integrate into our daily lives, transforming shopping, commerce, education, healthcare, manufacturing, entertainment and more.
Tech drivers
Key technological advancements driving this trend:
Spatial mapping and computer vision: These technologies model information about the user's physical space, making it possible to place interactive digital content in it.
Advanced headsets, smart glasses and AR contact lenses: These devices allow users to perceive and interact with virtual environments overlaid on their physical surroundings.
Interactive technologies: Voice recognition, haptics and hand/body/eye tracking systems facilitate intuitive interaction with digital content.
Better connectivity: Advances in cellular connection, Wi-Fi and other connectivity tech are establishing the speedy, low-latency basis for spatial computing.
Enhanced spatial experience tools: Spatial sound, sensors, photogrammetry and other tools work in unison to enrich the spatial computing experience.
Challenges to navigate

The data privacy issue: Spatial computing platforms and tech vendors will likely have access to novel types of user information. Users’ physical movements could betray sensitive health information. Data from eye-tracking solutions could indicate what users look at. This could deepen the already charged societal conversation about data privacy and collection.
Hardware limitations: Creating lightweight, comfortable, unobtrusive, affordable and stylish hardware for mass market adoption has been problematic. A new crop of devices still has far to go in miniaturization and wearability. Numerous devices, including Google Glass and Magic Leap headsets, have struggled to gain wide adoption.
Health and safety concerns: Societal unease with immersive technologies may have played a role in their failure to achieve mass market success so far. They have been blamed for problems including obesity, myopia, depression, social atomization and dissociation from reality. Regulatory and industry guidelines on the proper use of these tech tools could emerge.
Content creation and ecosystem development: Developing sufficient high-quality content and applications to attract and retain users will be crucial for the mainstream adoption of immersive tech. The bar is high: Content and apps need to be compelling enough for users to purchase expensive hardware and commit to wearing it despite potential inconveniences like discomfort and alienation from surrounding environments.
"I believe … [in] how profound spatial computing is. When you’ve tried it, it’s an ‘aha’ moment, and you only have a few of those in a lifetime."
TIM COOK
| APPLE CEO xxvii
Currently, spatial computing is used in systems such as GPS navigation, location tagging, ridesharing apps and smart devices controlled through virtual assistants. It also plays a pivotal role in virtual, augmented and mixed reality (VR, AR, MR) platforms. The next wave features sophisticated headsets and smart glasses, enhancing everything from educational simulations to collaborative telepresence and interactive entertainment.
Plausible futures
Bear case: Slow adoption beyond niche users (like gamers). A precedent would be the metaverse, adoption of which has been slow due to cumbersome hardware and other factors.
Bull case: Breakthroughs in the design and adoption of headsets and other hardware lead to widespread use in retail and entertainment, transforming those sectors.
Untethered from the laptop
An Israeli start-up is using spatial computing to circumvent the laptop screen. The company, Sightful, has introduced Spacetop, a portable computer without a monitor. Instead, it’s equipped with augmented reality goggles that present a 100-inch multi-window screen fixed in space in front of the user.xxxii
Companies working in the spatial ecosystem xxxiii
1. Content, experiences and social interactions: Consumer use cases (3D video calls, immersive maps, games) | Enterprise applications (interactive virtual training, digital twins)
2. Platforms and enablers: Operating system layers | App ecosystems | Developer platforms | 3D engines | Security and identity | Payment platforms
3. Hardware and devices: Headsets and glasses | Headphones and speakers | Auto HUDs | Mobile phones | Consoles and PCs | Accessories and components | Wearables | Cameras | TVs
4. Infrastructure: Network/connectivity | 5G/6G | Cloud | Wi-Fi 8 | Edge solutions (MEC) | Semiconductors (GPUs)
The spatial computing market is on a steep growth trajectory, expected to reach $620.2 billion by 2032.xxxiv
Big investments from tech companies indicate a promising future for spatial computing. As hardware and software evolve, this technology could become integral to our daily lives. While spatial computing’s advancement may be gradual, its potential to transform how we interact with technology and our environment is vast, heralding a future where digital and physical realities merge.
Apple's Vision Pro headset, equipped with a spatial operating system and multiple sensors and cameras, exemplifies advancement in this field.
Tech companies including Meta and Microsoft have heavily invested in the development of MR headsets, each targeting different market segments and use cases.
Google's Project Starline uses advanced AI, spatial audio and other technologies to create a video call experience with a lifelike sense of 3D depth and human presence. The company describes the solution as a “magic window” through which users can see the people they’re talking with.xxxv
Connected tech
Advances in network technologies are ushering in a new era of automation, interconnectivity and intelligence. Improvements in architecture, standards and data transmission could bring more intelligent experiences to stores, offices, homes and industrial settings. Our world will become more “instrumented,” meaning that we’ll be increasingly able to engage and interrogate it from any device (including new spatial devices), at any time, in any place. Shopping journeys could be more contextually relevant and guided, while experiences involving the mass movement of people, such as public transit and stadium entrances, could become automated. Of course, progress will be needed on multiple fronts — technology, interoperability, consumer buy-in — so advances will likely occur in intermittent steps.
Tech drivers
Several technologies form the foundation of next-gen networks:
Ultra-wideband (UWB) transmission: This short-range, low-power communications protocol transfers relatively large amounts of data and gives messaging applications advantages over the near-field communication (NFC) that enables contactless payments. Apple, Samsung and Google smartphones incorporate UWB.
Matter and Thread: Matter is a communications standard that allows interoperability between smart home components made by different manufacturers and improves device security. Thread is the low-power networking protocol that links Matter devices. Approximately 2,000 smart devices — from dishwashers to smoke sensors to light bulbs — use Matter and Thread.xxxvi
Edge computing: By bringing computational power and data storage closer to the location where they are needed, edge computing reduces latency and speeds up data processing.
Massive multiple-input, multiple-output (MIMO): Massive MIMO technology enhances network capacity by using multiple radio antennas to transmit and receive data simultaneously.
AI-ready cloud architecture: Cloud platforms such as Microsoft Azure, Amazon Web Services and Google Cloud are evolving to accommodate the increasing demand for advanced applications including generative AI.xxxvii
Smart ports advance global commerce
The port of Rotterdam, Europe’s largest port by cargo tonnage, is undergoing a multi-year digital transformation that uses sensor-equipped IoT devices to collect real-time information on ship movements, weather, water depth, berth use and more. The data is processed in the cloud, shared through a digital dashboard and used to guide ships through port with optimal efficiency, safety and speed — increasing the port’s capacity and reducing supply chain costs.xxxviii
Challenges to navigate

Security and privacy: Interconnected devices that collect and transmit data can capture vast amounts of information about people and serve as targets for hackers. Edge devices like sensors integrated into Internet of Things (IoT) systems can be particularly vulnerable.
The digital divide: Disparities in access to advanced network technologies across different regions and socio-economic groups could lead to unequal development. These disparities require consideration.
Interoperability standards: Ensuring that different devices and systems can communicate seamlessly is a challenge, particularly given the vast array of manufacturers and standards.
Integration and maintenance: Bridging intelligent networks with legacy systems and maintaining the resultant stacks are complex tasks.
Plausible futures
Bear case: Challenges in achieving interoperability due to service and standard fragmentation.
Bull case: Enhanced connectivity and interoperability across various sectors, driving intelligent experiences.
There were an estimated 14.7 billion machine-to-machine (M2M) connections in 2023, up from 8.9 billion in 2020, highlighting the expanding IoT network.xliii
Smart SGP
Arguably the world’s premier smart city, Singapore offers residents a variety of services through its networks. This includes a transportation system that, in addition to offering contactless payment, deploys an “open data” system to collect fare card data that managers use to reduce overcrowding and optimize service.xli Singapore is also deploying smart digital solutions in water and energy management, among other domains.xlii
Advancements in network technologies are not just enhancing connectivity: They are reshaping our living and working environments. As these technologies continue to mature, we expect to see smarter homes that are more responsive to our needs, retail experiences that are more personalized than ever before, and manufacturing processes that are more efficient and sustainable.
Power surge
The increasing demand for advanced software applications — such as generative AI, deep learning and virtual reality — is necessitating more computational power. High-performance computing is evolving to provide it. Advancements in chip technology and cloud computing in the short term, and the potential of quantum computing in the medium to long term, promise to deliver processing power far exceeding current capabilities.
Tech drivers
Technological developments in this area include:
Next-generation chips: Innovation in semiconductor materials and architectures enables chips with incredible speed and capacity. Central processing units (CPUs) made of billions of transistors handle single tasks rapidly, while graphics processing units (GPUs) with specialized cores excel in parallel processing tasks well suited for gaming and AI. Integrating neural processing units (NPUs) with GPUs further improves AI processing. These AI-accelerated processors are ideal for preparing pre-trained neural networks for the all-important inferencing stage of AI — where capabilities learned during training are used to make predictions.
Cloud computing: Cloud platforms are evolving to offer more robust and scalable computing resources, enabling complex computations and storage solutions for many applications — in an on-demand, pay-as-you-go package requiring minimal up-front capital costs.
Quantum computing: Quantum computers use the principles of quantum mechanics to process information in ways that traditional computers cannot, offering a potentially exponential increase in processing power (a one-qubit illustration follows this list).
Fusion power: Fusion, an emerging method of highly efficient nuclear energy generation based on joining atomic nuclei rather than splitting them, could eventually help meet computing’s energy demands while lessening data centers' environmental impact.
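To give a flavor of the quantum principle mentioned above, here is a toy one-qubit simulation in Python using numpy. It shows only the linear algebra of a Hadamard gate producing a superposition; real quantum development uses vendor SDKs and actual hardware, which this sketch does not attempt to model.

```python
# Toy illustration: one qubit put into superposition by a Hadamard gate.
import numpy as np

ket0 = np.array([1.0, 0.0])                   # the |0> state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

state = H @ ket0                              # equal superposition of |0> and |1>
probs = np.abs(state) ** 2                    # Born rule: measurement probabilities
print(probs)                                  # [0.5 0.5]
```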
Challenges to navigate

Thermal management: Dealing with the heat generated by high-performance computing so it does not impact performance and longevity is a persistent challenge.
Hype and excessive optimism: New technologies like quantum may take longer to develop than expected given the hype surrounding them. Unrealistic expectations could lead to disillusionment and potentially less funding for research and development.
Energy consumption and sustainability: More computing power means more data centers, which are energy intensive. Now responsible for an estimated 3% of the world's energy use, data centers could account for 4% by the decade's end.xliv Green computing and data center practices, which stress circularity and efficiency in energy use and materials, will be necessary to counteract this demand.
Shortages: Demand has led to periodic GPU shortages, leaving businesses to wait.
Era-defining chips
Graphics processing unit performance has increased roughly 7,000 times since 2003, according to the 2023 AI Index Report from Stanford University. In 2023, chipmaker NVIDIA and Israeli start-up Quantum Machines introduced the first system to combine quantum technology with cutting-edge classical computing. “Quantum-accelerated supercomputing has the potential to reshape science and industry,” the companies said in a statement, adding that it “will enable a new generation of innovators to solve some of the world’s greatest challenges.”
NVIDIA’s share price climbed more than 1,300% over the past five years (2019–2023), and the company ended 2023 with a market valuation exceeding $1.2 trillion.xliv
Up to $260 billion in construction projects are planned to expand semiconductor fabrication in the U.S.xlvi
Plausible futures
Bear case: Resource scarcity leads to slower development of new software capabilities.
Bull case: Advances in green computing enable development of innovative, sustainable applications.
Geopolitical competition
The quest for increasing computing power is intertwined with geopolitical dynamics. Nations are focused on securing access to essential hardware and natural resources critical for technological advancement, recognizing compute as a vital component of national security.

Restrictions and trade implications
U.S.-China trade dynamics: The U.S. has imposed restrictions on semiconductor sales to China, citing national security concerns including potential benefits to the Chinese military.xlviii
U.S.-Russia trade response: Similar restrictions were applied to Russia following its invasion of Ukraine in early 2022, impacting semiconductor trade.xlix

Raw materials and supply chain
Silicon and rare earth elements: China dominates the global supply, producing 79% of the world's raw silicon and 80% of certain rare earth elements essential for semiconductor production.l
U.S. dependence on imports: The U.S. significantly relies on imports for these critical elements, with dependencies ranging from 30% for copper to 100% for gallium and arsenic.li
The semiconductor industry: The concentration of high-performance chip manufacturing,lii exemplified by Taiwan-based TSMC’s dominance, could be geopolitically risky and is driving efforts to expand semiconductor production in the U.S.liii

The future of resource competition
As nations vie for technological supremacy, the competition for raw materials could become a critical factor in the global race for computing power. This resource competition may emerge as a wildcard, influencing the balance of power in pursuing advanced computing capabilities.

Computing power is on the cusp of a shift. The immediate future will see new chip technologies and cloud computing capacities help satisfy growing demands for processing power. In the longer term, quantum computing promises to revolutionize computing capacity, opening possibilities currently beyond our reach. Balancing technological progress with responsible energy use and sustainability will be crucial as we advance, particularly in data center operations.xlv The future of computing is not just about more power; it's also about more thoughtful, efficient and sustainable use of that power.
The global cloud computing market is expected to reach $1 trillion by 2028 and $1.6 trillion by 2030.xlvii
Innovations in data technology are helping enterprises build their brands, seize competitive advantage and strengthen data security and consumer privacy. This section addresses the tokenization of data to expand its utility, the use of advanced analytics to unearth game-changing business insights and emerging methods to protect data.
Tokenized value
In data security, tokenization is a means of safeguarding sensitive information, such as a credit card number, by replacing it with a unique, random code (called a token). But tokenization can also enable digital representations of assets — real estate, stocks, intellectual property and other sources of value — on a blockchain or network to make it transferable, tradeable and secure. This technology is being applied to various data types, from biometric attributes for identity authentication to customer reward points for easier redemption, paving the way for new data interoperability and programmability standards.
Tech drivers
Tokenization depends on the following factors:
A. Alphanumeric tokenization: A method for protecting sensitive information that involves substituting it with intrinsically meaningless combinations of letters or numbers, known as tokens. Used for payment processing in digital wallets such as Apple Pay and Google Pay (a toy sketch follows this list).
B. Cryptographic tokenization: The “scrambling” of data to make it unreadable without the help of an encryption key. Used to create digital assets, also known as tokens, that are transferable or tradeable on blockchains or other networks.
C. Interoperability and standardization: Tokens created on one blockchain platform or network must be easily transferable to or recognizable on another, for maximum utility.
D. The impregnability imperative: Alphanumeric tokenization is managed within databases that must remain secure. Cryptographic tokenization needs to avoid becoming vulnerable to smart contract or software errors.
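As a concrete illustration of alphanumeric tokenization (item A), here is a minimal Python sketch under simplifying assumptions: the in-memory dictionary stands in for a hardened, access-controlled token vault, and the token format is invented for the example.

```python
# Toy alphanumeric tokenization: swap a card number for a random token,
# keeping the real value only in a (stand-in) secured vault.
import secrets
import string

_vault: dict[str, str] = {}  # token -> original value (illustrative only)

def tokenize(card_number: str) -> str:
    """Replace a card number with a random, intrinsically meaningless token."""
    token = "tok_" + "".join(
        secrets.choice(string.ascii_letters + string.digits) for _ in range(16)
    )
    _vault[token] = card_number
    return token

def detokenize(token: str) -> str:
    """Recover the original value; only the vault owner can do this."""
    return _vault[token]

t = tokenize("4111 1111 1111 1111")
print(t)              # e.g. tok_9fK2... -- safe to store or transmit
print(detokenize(t))  # original card number, available only via the vault
```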
Challenges to navigate

Public perception: Tokenization and digital assets are complex matters that can be misunderstood. A lack of understanding by users can hinder adoption and trust.
Data privacy: There are concerns about how tokenized personal information is used and stored and who has access to it, especially in the case of biometrics.
Security risks: While tokenization enhances security, it's not immune to cyber threats. Smart contract vulnerabilities, blockchain platform security breaches and the risk of token theft remain concerns.
Regulatory compliance: Navigating the complex and evolving regulatory landscape surrounding digital assets and tokenization is a significant challenge. Jurisdictions have varying regulations regarding using and trading tokenized assets, impacting their adoption.
Plausible futures
Bear case: Incompatibility and competing standards hinder interoperability and the widespread adoption of tokenization.
Bull case: Tokenizing loyalty and digital identity leads to global acceptance, driving innovative applications.
Tokenization in action
Major brands, including Nike, Starbucks, Dolce & Gabbana and Lufthansa, are adopting tokenization for consumer rewards programs.liv lv
Innovative applications like HENY's app, which uses non-fungible tokens for patient data, suggest the potential for tokenization in healthcare and privacy protection.lvi
The horizon for tokenization is expanding, with emerging applications across healthcare, finance and cybersecurity. By enabling different types of data tokenization, this technology enhances security and opens new avenues for data use, making interactions more personalized and secure. As we move forward, tokenization could play a crucial role in managing and utilizing sensitive information securely.
Insight factories
Companies have always tried to glean meaning from data. But now, with magnitudes more data available to them, they’re leveraging advanced analytics, machine learning and AI to extract deeper insights, improve decision making and transform the business landscape. Organizations with rich data of their own can combine it with other datasets to produce new knowledge. Gen AI is further improving data analysis by allowing businesses to analyze previously inaccessible unstructured data, including social media posts and multimedia content. This capability could lead to a paradigm shift in data mining and offer a competitive advantage to organizations with large data repositories.
Tech drivers
The technologies driving this trend include:
A. Advanced analytics: Using sophisticated analytical methods to interpret complex datasets yields actionable insights.
B. Machine learning and AI: Employing AI and machine learning to analyze data patterns, predict outcomes and automate decision-making processes.
C. Recommendation systems: Creating personalized customer experiences using data to suggest products, services and content (see the sketch after this list).
D. Access to unstructured data: Unstructured data will become more valuable and lead to better analysis as more tools can work with it.
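As a minimal illustration of item C, the sketch below computes item-to-item cosine similarity over a toy purchase matrix. The users, items and purchase data are invented for the example; production recommenders are far more sophisticated.

```python
# Toy item-to-item recommender: suggest the product most similar to one
# a customer already bought, based on co-purchase patterns.
import numpy as np

items = ["shoes", "jacket", "scarf", "hat"]
# rows = users, cols = items; 1 = purchased
purchases = np.array([
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [1, 1, 0, 1],
])

# Cosine similarity between item columns.
norms = np.linalg.norm(purchases, axis=0)
sim = purchases.T @ purchases / np.outer(norms, norms)
np.fill_diagonal(sim, 0)  # an item shouldn't recommend itself

bought = "shoes"
best = items[int(np.argmax(sim[items.index(bought)]))]
print(f"Customers who bought {bought} may also like {best}")  # -> jacket
```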
Challenges to navigate

Skill gaps and talent shortages: In some markets, there could be a shortage of skilled professionals in data science, analytics and AI, which can hinder the development and operation of insight factories.
Implementation and maintenance costs: Establishing and maintaining an adequate data analytics infrastructure can be costly, especially for small to medium-sized enterprises.
Responsible use of data: Responsible data collection and utilization practices are key, especially when it comes to personal data used for predictive analytical purposes.
Complexity of data integration: Assimilating data from various sources and formats and ensuring that it is harmonized and consistent for analysis can be complex and resource-intensive.
Data quality and integrity: Data subject to analysis must be accurate, complete and reliable. Poor quality data can lead to misleading conclusions and flawed decision-making.
Plausible futures
Bear case: Data quality and accessibility challenges impede insightful analytics.
Bull case: Advancements in technology and infrastructure usher in a new era of data-driven enterprise insights.
Growth in insights
The data analytics market is projected to grow from $272 billion in 2022 to $745 billion in 2030.lvii A significant increase in enterprise spending on analytics is expected, surpassing other software types.
The role of data scientists and analysts has expanded dramatically: More than 80% of companies had a chief data officer in 2022, up from 12% in 2012.lviii
The cloud AI market is expected to grow from $43 billion in 2022 to $887 billion by 2032.lix
The future of enterprise decision-making is increasingly rooted in data. As technology evolves, the ability to gather, process and analyze data will become even more sophisticated, enabling businesses to make more informed predictions and decisions. This trend could stimulate further innovation in AI and analytics as companies strive for a competitive edge through enhanced data capabilities. The central challenge lies in balancing the strategic use of data with data responsibility considerations — including the need to protect privacy and avoid bias.
Protecting data
In the digital age, the emphasis on responsible data management is reshaping enterprise strategies. Companies increasingly recognize the importance of data integrity, protection and privacy, particularly in light of sophisticated data breaches and reported misuse of data. Organizations that want to earn consumer trust will not only comply with regulations but work to anticipate changes in the broader data responsibility environment before they happen.
Tech drivers
Critical factors in data protection include:
A. Advanced security measures: Many organizations are adopting new security measures to protect data. Zero trust architecture is designed to ensure that a breach on the perimeter of an organization's IT infrastructure — i.e., at a web or edge node — is restricted to a single system so that the attacker cannot penetrate further into the organization's environment.
B. Privacy-enhancing technologies (PETs): Solutions that enhance user privacy — such as data anonymization, encryption and data clean rooms (digital environments where anonymized, aggregated information is shared) — can keep personal information confidential.
C. Regulations: Regulations such as the General Data Protection Regulation (GDPR) in Europe and the California Consumer Privacy Act (CCPA) set higher data protection and privacy standards for digital players. More regulations are expected to be adopted in 2024 in the privacy and AI space, including the EU AI Act.
D. Need for better data oversight: Effectively managing data requirements, creation, standardization, integration and storage. The ability to trace data's origin, lineage, flow and transformations helps maintain transparency and data integrity and makes compliance easier.
E. Growing consumer awareness: Growing consumer awareness and data privacy concerns influence business practices and technology adoption. The rise in high-profile data breaches has heightened public and regulatory scrutiny of how companies handle sensitive information.
Safe access to sensitive data
Privacy-enhancing technologies, or PETs, let organizations analyze and extract insights from sensitive datasets without revealing the nature or details of the data itself — even to the analysts involved.

KEY PET TECHNIQUES
Homomorphic encryption: This technique allows data to be processed while still encrypted. Computations are performed on the encrypted data; only after this are the results decrypted. This method is particularly impactful in safeguarding financial and healthcare data (a toy demonstration follows this sidebar).
Synthetic datasets: Created to be statistically similar to original datasets, synthetic data can be used for software testing or AI model training without risking the exposure of sensitive information.

POTENTIAL AND APPLICATIONS
Medical research: Advancing medical research and patient care without compromising patient privacy.
Financial sector: Developing anti-money laundering tools without accessing individual bank accounts.
Autonomous vehicles: Allowing self-driving cars to learn from shared data without revealing specific driver information.
Business analytics: Gaining insights about business operations without intruding on customer privacy.

CHALLENGES AND FUTURE DEVELOPMENT
Computational demands: Homomorphic encryption currently requires significant computational effort. This has been a barrier to its widespread adoption.
Upcoming innovations: By 2024, several companies are expected to test or commercialize chips that accelerate homomorphic encryption, making computing on encrypted data almost as efficient as on unencrypted data.

The advancement of PETs, particularly homomorphic encryption, could transform data privacy and security, offering a better way of handling sensitive information.
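To make the homomorphic property concrete, here is a toy Python demonstration using the Paillier cryptosystem, which is additively homomorphic: multiplying two ciphertexts yields a ciphertext of the sum of the plaintexts. The primes are tiny demonstration values and utterly insecure; real deployments rely on vetted cryptographic libraries and large keys.

```python
# Toy Paillier demo: compute on encrypted values without decrypting them.
# INSECURE demo parameters -- for illustration only.
import math
import secrets

p, q = 1789, 1861                      # demonstration primes only
n = p * q
n2 = n * n
g = n + 1                              # standard simple choice of generator
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)                   # modular inverse; valid because g = n + 1

def encrypt(m: int) -> int:
    r = secrets.randbelow(n - 1) + 1
    while math.gcd(r, n) != 1:         # r must be invertible mod n
        r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n) * mu % n

a, b = 1200, 345
# Multiplying ciphertexts adds the plaintexts -- no decryption needed.
c_sum = (encrypt(a) * encrypt(b)) % n2
print(decrypt(c_sum))                  # 1545, computed on encrypted values
```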
The shift to zero-trust
Traditional cybersecurity was premised on building walls around computing environments. But an organization's digital environment often has numerous components in different places, making it hard to enclose with a single perimeter. The shift to the cloud means a critical part of any organization's resources is offsite. Also offsite are remote workers, some using personal laptops and phones. Then there are third-party service providers, a range of IoT sensors and devices at the edge, and customers who require access to company systems.
Given such sprawling environments, single firewall defense requires updating. The need for a security paradigm better suited for a distributed computing era has given rise to zero-trust architecture (ZTA). It's a concept for security that either dispenses with or downplays the importance of the perimeter in favor of defending every network resource or node.
ZTA's key pillars:lx
The principle of least privilege: The idea that network users require only so much access to fulfill their duties (in contrast to the older perimeter model, where any user who could get inside the enclosure, so to speak, was essentially free to wander around).
Segmentation: The division of environments into discrete zones, the better to minimize any "blast radius" caused by a potential breach and to prevent lateral movement by any bad actor who does manage to get in.
Persistent validation: Applications and end devices such as laptops, and especially users, are constantly tasked to verify their identities (see the sketch below).
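Here is a minimal sketch of how persistent validation and least privilege might look in code, assuming a hypothetical session store and scope naming scheme; real zero-trust deployments use identity providers, signed tokens and policy engines rather than an in-memory dictionary.

```python
# Toy zero-trust request check: re-verify identity on every request
# (persistent validation) and allow only explicitly granted actions
# (least privilege). SESSIONS and scope names are hypothetical.
from dataclasses import dataclass

# token -> (user, granted scopes), as issued by a hypothetical identity provider
SESSIONS = {"tok-abc": ("alice", {"reports:read"})}

@dataclass
class Request:
    token: str
    resource: str
    action: str

def authorize(req: Request) -> bool:
    session = SESSIONS.get(req.token)
    if session is None:                # persistent validation: every request re-checked
        return False
    _, scopes = session
    return f"{req.resource}:{req.action}" in scopes  # least privilege

print(authorize(Request("tok-abc", "reports", "read")))   # True
print(authorize(Request("tok-abc", "reports", "write")))  # False: scope not granted
print(authorize(Request("tok-xyz", "reports", "read")))   # False: unknown token
```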
30% fewer security issues
Forrester research indicates that companies that have instituted ZTA see 30% fewer security issues, and the incidents they do experience are 40% less severe.lxi
Plausible futures
Bear case: Difficulties in adopting and integrating advanced data protection strategies hinder progress.
Bull case: Robust, cost-effective and widely adopted data security measures drive innovation and trust in the digital economy.
The role of data responsibility in enterprise management is becoming increasingly crucial. Businesses must balance the need to use data in order to grow against the ethical imperatives of data management. As technology advances, so too will strategies for protecting, managing and responsibly using data. This includes fortifying defenses against breaches and ensuring data practices align with societal values and individual rights. The future will likely see heightened investment in technologies and processes that strengthen data security, enhance privacy and build trust between companies and their stakeholders.
One of the most compelling aspects of these emerging tech trends is their intersection, as they amplify each other's impact. The convergence of AI, computing and data technologies is driving innovations far beyond the tech sector — transforming finance, retail, healthcare, education and more. AI enhances data analysis and decision-making, while compute power enables faster AI training and inference. Spatial computing benefits from AI-driven object recognition and data becomes more valuable as it feeds into AI algorithms.
"We have waited for this moment where the data and the compute and the GPUs and all the technology come together, and I would say it’s probably the single most exciting period in technology in decades.”
Safra Catz | Oracle CEO lxvii
AI + Data

AI's effectiveness heavily depends on the quality and accessibility of data. Through tokenization and stringent data management, data integrity and quality are improving for future AI training and inference processes. AI algorithms are increasingly effective in analyzing data, identifying patterns and detecting anomalies, contributing to enhanced data integrity and more insightful analytics.
AI + Compute

The fusion of AI with advanced computing resources is enhancing the capabilities of AI models. High-performance computing and specialized hardware like GPUs are pivotal in training expansive and complex AI models. Concurrently, AI is transforming the computing infrastructure, optimizing and automating resource allocation and workload management. This synergy enables more efficient processing, leading to quicker and more accurate AI-driven insights.
Data + Compute

As data volumes grow exponentially, the role of computing capabilities in data processing becomes more crucial. This synergy is particularly evident in areas like real-time analytics and machine learning algorithms, where the need for immediate and efficient data processing is paramount. Advances in computing technologies are enhancing data processing speeds to enable the extraction of insights from large datasets with greater efficiency.
Unstructured data has exploded online, especially in social media. In 2022, it accounted for 90% of data created.lxvi But because older forms of AI cannot process it, it has been underutilized for analytical purposes. Generative AI can handle it quickly, so that could change, and massive quantities of hitherto untapped information could lead to new insights.
The market value of AI in biotech is expected to grow by almost 30% annually through 2032.lxiv The most exciting of gen AI's possible applications in biotech may be the costly and time-intensive drug discovery process. It could prove effective there thanks to its data-processing abilities, its utility in creating new pharmaceutical molecules and in protein engineering, and its help in modeling the potential outcomes of clinical trials, among other use cases.lxv
Amazon Web Services and NVIDIA, the hardware company whose GPUs and CPUs have been vital to AI, are collaborating to create next-gen cloud infrastructure and software to power ongoing breakthroughs in gen AI lxi by facilitating foundation model training and app development.lxiii
Unstructured data, 2022
90%
Unstructured data's share of total data created in 2022
The future of tech convergence
The convergence of these technologies represents a shift in how technology is applied across sectors.
The productive interaction of AI, computing and data could have material consequences for how people live daily — how we shop, work, play and interact with each other.
These emerging technologies will enable banks to analyze vast amounts of financial data more efficiently, leading to better risk assessment and fraud detection and more personalized customer services. Banks can leverage AI to optimize their operations and offer more sophisticated and inclusive financial products to their customers.
Merchants can utilize these advances to enhance customer experiences, from personalized marketing to streamlined supply chain management. AI-driven insights can help merchants better understand consumer behavior, optimize inventory and predict market trends.
As these trends develop, they promise to keep driving growth and transformation in the digital era. This synergy between AI, compute power and data could unlock new efficiency, innovation and capability. The challenge for industries will be to strategically navigate this convergence and ensure that the integration of these technologies aligns with broader business goals.
Ultimately, trust is the critical differentiator needed to ensure these trends progress smoothly — trust that all AI is responsible AI, that consumers and regulators can keep pace with change and understand its benefits, and that merchants, banks and tech companies are equipped to protect their customers’ data and digital assets.
At Mastercard, we're already working with these emerging technologies, employing them to safeguard the more than 125 billion transactions on our network every year. Our teams of thousands of AI engineers, data scientists and technologists are committed to developing practical solutions that integrate privacy and ethics by design and adhere to the highest standards in security. Across our capabilities — data intelligence, open banking, identity, fraud protection and cybersecurity — Mastercard ensures trust is at the forefront and technology is used responsibly and ethically. By embracing technological change while addressing its challenges, we can step into the future of commerce and ensure technology influences the world for the greater good.
AI-powered assistants could streamline the shopping process for consumers, surfacing appropriate products and accelerating checkout. Shopping Muse, a personal retail assistant from Dynamic Yield, a Mastercard company, combines conversational capabilities with personalization and recommendation capacities to facilitate the journey from product discovery to sale. Additionally, Shopify, Instacart, Mercari, Carrefour and Walmart are testing chatbots that may become widely available next year.
Tech may bring the travel agent back in virtual form. A business traveler or other user could prompt a bot on TripAdvisor or Kayak to make a booking and then leave it to gen AI to reconcile different trip elements — from flights to lodging to dining to sightseeing and beyond, across time zones and currencies — and produce an itinerary in minutes rather than hours.
Companies including Microsoft, Google, Meta and Amazon are integrating gen AI assistants into their consumer and office products, expanding the use of this technology. In organizations, these next-gen assistants could perform communication tasks for individuals, from crafting email responses to scheduling meetings. In customer service, they could turn chatbots into practical tools that resolve clients' problems and prioritize jobs to be done.
Emerging use cases

Code documentation: 45%-50% time savingsx
Gen AI can automate the creation of vital documentation that explains how code was built and how to work with it.x

Code generation: 35%-45% time savingsxi
Models can leverage an organization's code repository to automate subsequent code creation and test generation. A code-writing solution could also work in response to natural language prompts.xi

Code refactoring: 20%-30% time savingsxii
Reworking code while not changing its function is both labor-intensive and indispensable. Gen AI could do this work, especially in industries whose systems run on legacy code.xii

Inclusive development
Gen AI could propel the low-code/no-code movement, making software development more accessible to those without specialized training.
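To make the documentation use case concrete, the sketch below wraps a chat-completion API in a small helper. It assumes the OpenAI Python SDK and an illustrative model name; any comparable code-capable model could be substituted.

```python
# Hypothetical helper that asks an LLM to document a Python function.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def document_code(source: str) -> str:
    """Return a model-generated explanation of `source`."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system",
             "content": "You write concise developer documentation."},
            {"role": "user",
             "content": f"Explain what this code does and how to use it:\n\n{source}"},
        ],
    )
    return response.choices[0].message.content

print(document_code("def fib(n): return n if n < 2 else fib(n-1) + fib(n-2)"))
```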
Emerging use cases

Entertainment
The film and TV industry increasingly uses deepfake technology to enhance visual effects. AI content creation is becoming more democratized thanks to readily available tools like RunwayML for video, Midjourney for high-quality still images and Pixabay for music. The rapid progress has led to concerns among screenwriters over whether Hollywood might start using AI-generated scripts and actors to cut costs and simplify production. This issue became a major source of contention in the 2023 guild strikes against studios.

Education
Gen AI could create lifelike, responsive avatars for education and job training. In the former, students might find themselves interrogating historical figures about world-changing events. In the latter, service industry trainees could practice customer relations with AI-generated avatars. In finance, both bank trainees and retail investors could use gen AI-created environments to practice complex trading functions.

Disinformation campaigns
Recent armed conflicts have been test cases for gen AI-driven disinformation campaigns. In the future, military decision-makers will have to consider gen AI's ability to create, in quantity, fraudulent content that could shape the global strategic information environment. In politics, deepfakes could impact elections.
Emerging use cases

Training
Over 200,000 Bank of America employees worldwide have trained with AI and immersive VR technology that simulates challenging customer conversations and other realistic scenarios.xxix

Enhanced decision-making
When Kellogg needed to determine the best spots on store shelves to display new products, it created a virtual supermarket. Focus group shoppers wandered the aisles via headsets that tracked their eye movements. The data collected informed decisions about in-store product placement — boosting sales by 18%.xxx

Commerce
Spatial computing interfaces could improve the consumer experience by overlaying product information on in-store products. The apparel industry could benefit from this tech, with virtual try-on solutions, digital twinning and virtual fashion shows becoming common. Luxury goods company LVMH has partnered with Fortnite creator Epic Games to develop immersive shopping experiences.xxviii

3D visualization
Researchers at the University of Alberta in Canada developed ProjectDR, an AR platform that projects patients' diagnostic CT images, reconstructed for 3D use, onto their skin. This allows surgeons wearing MR goggles to visualize operations and helps clinicians perform procedures such as spinal manipulation more effectively.xxxi

Gaming
Gaming's heavy reliance on virtual reality has made it a pioneer in spatial computing. Apple's Vision Pro headset, a "spatial computer," will stimulate further extended reality innovation in the gaming space.
Smarter retail
UWB technology may usher in highly personalized, unattended retail experiences where shoppers are identified when they enter the store, guided to relevant products and promotions, and pay without checking out or opening a wallet.

Smarter vehicles and cities
Advances in network technologies are crucial in self-driving cars, where real-time data processing is essential. Network tech is also important in other aspects of the driving experience, such as in-car payments, where Mercedes-Benz has been an innovation leader. These technologies are also foundational in developing smart cities, where intelligent, interconnected systems enhance urban life.xxxx

Smarter homes
The Matter and Thread standards could improve how smart home devices interact, creating a cohesive and interoperable ecosystem. This includes seamless integration of devices into a single application framework, overcoming previous incompatibilities. Matter can enable one brand's security camera motion detector to activate another's lights or a door lock to trigger other brands' blinds, bulbs or thermostats.xxxix
Emerging use cases

Healthcare
Powering advanced diagnostic tools, personalized medicine and drug discovery.

Scientific research
Enabling more accurate simulations and models in climate science, physics, biology and other fields.

Advanced AI and machine learning
Leveraging increased computational power to train more effective AI models capable of handling complex tasks and large datasets.

Financial sector
Using computing power for complex financial modeling, risk analysis, fraud detection, FX settlement and real-time transaction processing.
Emerging use cases

We are moving toward a future in which nearly anything of value can be represented as a discrete digital token — stocks, bonds, real estate, digital assets like in-game items and currencies such as stablecoins. Here are three use cases with near-term potential:

Consumer loyalty programs
Transforming loyalty rewards into digital tokens could make them easier to move across various loyalty platforms, enabling seamless earning and redemption across an interoperable loyalty ecosystem and enhancing customer engagement and satisfaction. This interoperability could improve small merchants' ability to compete.

Biometric security
The tokenization of biometric data, like fingerprints and facial traits, adds an extra layer of security to devices and systems, preventing unauthorized access and enhancing user convenience.

Digital identities
Tokenization of personal data gives individuals more control over their digital identities, enhancing security and privacy in online transactions and interactions.

Mastercard offers several identity, biometric checkout and loyalty solutions. Click here to learn more.
Emerging use cases

Customer experience optimization
Enhancing the customer journey with personalized interactions informed by data.

Operational efficiency
Data-driven optimization to improve processes and reduce costs.

Market analysis and trend prediction
Identifying opportunities and predicting market trends through data analytics.

Risk assessment and management
Predictive models for identifying risks and formulating mitigation strategies.

Credit scoring
Analyzing vast data resources, including previously untapped unstructured data streams, to make scoring more precise — or create new scoring models that widen the pool of credit recipients.
Emerging use cases

There is an increasing enterprise focus within each of these areas, which should improve capabilities:

Responsible AI governance
Start-ups are arising to fill the market need for technology that helps companies ensure that they are using AI responsibly — safely and in compliance with evolving regulation.

Data consultancy services
Both start-ups and legacy players are expanding into offering end-to-end consultancy services on data-related matters, including governance and privacy.

New approaches to data access
Ensuring users have access to and understand the appropriate data to promote collaboration and prevent unauthorized access is important. Data cataloging and metadata management are key practices here. A centralized repository of data assets and their metadata enables users to discover, understand and use data efficiently.

Privacy compliance
Non-compliance with global data privacy regulation can be costly. A number of companies are cropping up to provide technology that makes compliance easier.
Here are some of the ways these trends intersect:
Beyond code generation, AI copilots help with various stages of the software development lifecycle, including debugging, testing, documentation and even the conceptual stages of software design. Some tasks see striking time savings when engineers deploy gen AI tools to accomplish them:
Data
In Section 3, we focus on the evolving landscape of data management. Here, we review how tokenization is broadening the utility of data in areas such as identity and loyalty programs. This section also explores the growing reliance on advanced analytics and AI in decision-making, underscoring the need to protect data and embed ethical management principles.
1. Personal copilots
The rise of gen AI assistants
150+
Number of current copilot platforms across categories such as software engineering, productivity, healthcare, design and human resources.
Challenges to navigate

Responsible AI
Ensuring AI decisions are responsible — transparent, fair and unbiased — is crucial. Outputs must be carefully managed to avoid reflecting, or even amplifying, societal and underlying data biases.

Privacy and data security
AI assistants may require access to personal data for some use cases. Safeguarding this data and ensuring user privacy are paramount.

User trust and acceptance
Building trust is essential given concerns about AI replacing humans in the workforce and knowing too much about individuals.

Technical limitations
Despite advances, AI assistants are still evolving and may face challenges in understanding context or managing complex tasks.

Intellectual property
Gen AI models train on vast amounts of internet data, some of which is copyrighted. Users of gen AI outputs could unwittingly expose themselves to legal issues. In December 2023, The New York Times became the first major U.S. media corporation to file a gen AI-related copyright infringement suit, against OpenAI and Microsoft.
Personal copilots
Generative AI could change software development, enhancing productivity and innovation by automating standard developer tasks. Gen AI applications can assist with legacy code and support new product development — leading to 20% to 45% higher engineering productivity, according to McKinsey.viii While these apps bring efficiency and automation, human oversight remains critical, particularly in monitoring for and correcting biases and other errors that AI might introduce.
Software copilots can assist software engineers in writing source code, designing software architecture, testing, understanding existing code (essential for code maintenance) and more creative tasks such as UI/UX design. These emerging AI tools understand natural language inputs, so developers can describe the code they need in simple terms and leave it to the AI to write that code in the desired programming language. Non-engineers could take on some light engineering work, potentially enabling organizations to redesign their development processes.
Tech drivers
GitHub
GitHub Copilot is a prime example of this technology in action. It creates code snippets based on software developers' natural language descriptions. A study by GitHub found that developers using Copilot could complete a coding task 55% faster than those not using it.ix
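To illustrate the interaction pattern: a developer writes a natural-language prompt as a comment, and a copilot-style tool proposes an implementation. The example below is hypothetical output we wrote for illustration, not actual GitHub Copilot output.

```python
# Prompt written by the developer:
# "Parse an ISO-8601 date string and return how many days ago it was."

# A copilot-style suggestion might look like this:
from datetime import date, datetime

def days_since(iso_date: str) -> int:
    """Return the number of days between `iso_date` and today."""
    then = datetime.strptime(iso_date, "%Y-%m-%d").date()
    return (date.today() - then).days

print(days_since("2023-11-09"))  # prints the elapsed days as of the run date
```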
Emerging use cases
Challenges to navigate

Avoiding over-reliance
As developers gain confidence in AI-generated code, they may be tempted to "cut and paste" it directly into their codebases, which could have negative results. Gen AI tools require human oversight, and hand-coding will remain necessary.
Language limits
It still needs to be determined whether developers can express complex algorithms precisely enough in natural language for an AI to generate appropriate code.
Intellectual property dilemmas
IP issues could arise as AI coding assistants possibly use bits of copyrighted code in their training. A class-action lawsuit is pending against developers’ platform GitHub, its owner Microsoft and OpenAI for using platform users’ code and other materials in creating GitHub Copilot.xiii
Integration with existing systems
Integrating AI tools with existing development environments and workflows can be complex and may require significant adjustments.
Potentially sub-par and unreliable AI-generated code
Ensuring the code generated by AI is reliable, efficient and bug-free requires human oversight. There's also the risk of developing functionally correct code that is inefficient or under-optimized for its specific use case.
Plausible futures

Bear Case
Slow adoption in large enterprises due to IP risks and integration challenges related to data efficacy.

Bull Case
A global shift towards nearly complete AI-powered software development, with AI-first approaches being taught in university programs.
92%
A recent study conducted by GitHub and Wakefield Research sheds light on the impact of AI on the developer experience.
The survey of 500 U.S.-based developers from companies with 1,000-plus employees revealed that 92% already use AI-powered coding tools, signaling a significant shift in the field.xv
Enhancing code
AI tools that generate highly realistic media — audio, video, images and text — are increasingly used to create "deepfakes" for fraudulent activities, scams and disinformation campaigns. This has ignited a battle between cybercriminals on one side and companies and governments on the other. The rise of malicious deepfakes has increased the market for detection tools and spurred new regulations, emphasizing the importance of data provenance and AI's role in bolstering cybersecurity responses.
Deepfakes can be dangerous, and caution is warranted. Yet concerns shouldn't overshadow the fact that gen AI-powered representations can also have productive uses in entertainment, education, training and other areas where there is a continual need for high-quality textual, visual and other content.
Fraudulent and misleading content items have always existed, but now there are more tools for making them and more ability to make them at scale. Deepfakes are typically created using a type of machine learning architecture known as a generative adversarial network (GAN). A GAN involves two competing neural networks: a generator and a discriminator. The generator creates the content, while the discriminator evaluates its authenticity. In essence, the generator attempts to fool the discriminator. Continuous feedback and refinement make the generator increasingly adept at producing realistic and convincing media — and the discriminator increasingly adept at detecting whether the media is real or fake. This constant feedback loop enhances the realism of the generated media, presenting both challenges and opportunities.
Tech drivers
Generator vs. discriminator
How a GAN workslxviii
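For readers who want to see the feedback loop in code, below is a minimal GAN sketch in Python with PyTorch. It learns a toy one-dimensional distribution rather than images, and the architecture and hyperparameters are our own illustrative choices, but the generator-versus-discriminator training loop is the same one that underlies deepfake tools.

```python
# Minimal GAN: a generator learns to mimic samples from N(3, 1) while a
# discriminator learns to tell real samples from generated ones.
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))                # generator
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())  # discriminator
loss = nn.BCELoss()
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)

for step in range(2000):
    real = torch.randn(64, 1) + 3.0   # "authentic" data
    fake = G(torch.randn(64, 8))      # generated data from random noise

    # Discriminator: learn to label real as 1 and fake as 0.
    opt_d.zero_grad()
    d_loss = loss(D(real), torch.ones(64, 1)) + loss(D(fake.detach()), torch.zeros(64, 1))
    d_loss.backward()
    opt_d.step()

    # Generator: learn to make the discriminator call fakes real.
    opt_g.zero_grad()
    g_loss = loss(D(fake), torch.ones(64, 1))
    g_loss.backward()
    opt_g.step()

print(G(torch.randn(1000, 8)).mean().item())  # should drift toward ~3.0
```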
Emerging use cases
46%
46% of businesses have been targeted by identity fraud using deepfakes.xvii
37%
37% of them were targeted by deepfake voice fraudxviii and 29% by deepfake videos.xix
42%
The deepfake detection market is expected to grow 42% annually through 2026.xx
Challenges to navigate

Lack of industry standards
Given the range of threat sources, global coordination is required to develop content authentication approaches.

Advancement of deepfake tech
Keeping up with the development of deepfake technology is a constant challenge for makers of detection tools.

Lack of public awareness
Increasing public education about deepfakes is vital to mitigate risks.

Accuracy issues
Achieving high accuracy in detection is crucial to avoid the consequences of false positives or negatives.
Plausible futures

Bear Case
Widespread deepfakes disrupt various sectors, from news to e-commerce. In response, organizations slow their use of gen AI while the regulatory response stifles innovation.

Bull Case
Industry and government successfully establish clear guidelines and standards to manage deepfake content. Innovators develop foolproof tools to identify fraudulent material.
Advances in chip architecture, cloud computing, connectivity, mixed reality devices and processing at the edge have the potential to transform financial services, consumer experiences, healthcare and more. In this section, we examine the impact of spatial computing, smarter networks and increased computational power.
Race to detect
Spatial computing could transform human-machine interaction, enabling people to use hand gestures and eye movements to engage with virtual objects, images and applications displayed across their fields of vision through headsets and smart glasses. This can make possible immersive mixed-reality experiences that blend the digital and physical worlds and deliver an unprecedented level of authenticity — as if the user were inhabiting a movie. As spatial computing technology matures over the next several years, it could integrate into our daily lives, transforming shopping, commerce, education, healthcare, manufacturing, entertainment and more.
Currently, spatial computing is used in systems such as GPS navigation, location tagging, ridesharing apps and smart devices controlled through virtual assistants. It also plays a pivotal role in virtual, augmented and mixed reality (VR, AR, MR) platforms. The next wave features sophisticated headsets and smart glasses, enhancing everything from educational simulations to collaborative telepresence and interactive entertainment.
Tech drivers
Emerging use cases
Challenges to navigate

Content creation and ecosystem development
Mass adoption will depend on a critical mass of compelling content, applications and developer support for spatial platforms.
Health and safety concerns
Societal unease with immersive technologies may have played a role in their failure to achieve mass market success so far. They have been blamed for problems including obesity, myopia, depression, social atomization and dissociation from reality. Regulatory and industry guidelines on the proper use of these tech tools could emerge.
Hardware limitations
Creating lightweight, comfortable, unobtrusive, affordable and stylish hardware for mass market adoption has been problematic. A new crop of devices still has far to go in miniaturization and wearability. Numerous devices, including Google Glass and Magic Leap headsets, have struggled to gain wide adoption.
The data privacy issue
Spatial computing platforms and tech vendors will likely have access to novel types of user information. Users’ physical movements could betray sensitive health information. Data from eye-tracking solutions could indicate what users look at. This could deepen the already charged societal conversation about data privacy and collection.
1. Content, experiences and social interactions
Spatial interfaces
Advances in network technologies are ushering in a new era of automation, interconnectivity and intelligence. Improvements in architecture, standards and data transmission could bring more intelligent experiences to stores, offices, homes and industrial settings. Our world will become more “instrumented,” meaning that we’ll be increasingly able to engage and interrogate it from any device (including new spatial devices), at any time, in any place. Shopping journeys could be more contextually relevant and guided, while experiences involving the mass movement of people, such as public transit and stadium entrances, could become automated. Of course, progress will be needed on multiple fronts — technology, interoperability, consumer buy-in — so advances will likely occur in intermittent steps.
Emerging use cases
Plausible futures

Bear Case
Challenges in achieving interoperability due to service and standard fragmentation.

Bull Case
Enhanced connectivity and interoperability across various sectors, driving intelligent experiences.
Challenges to navigate

Integration and maintenance
Bridging intelligent networks with legacy systems and maintaining the resultant stacks are complex tasks.

Interoperability standards
Ensuring that different devices and systems can communicate seamlessly is a challenge, particularly given the vast array of manufacturers and standards.

Security and privacy
Interconnected devices that collect and transmit data can capture vast amounts of information about people and serve as targets for hackers. Edge devices like sensors integrated into Internet of Things (IoT) systems can be particularly vulnerable.

The digital divide
Disparities in access to advanced network technologies across different regions and socio-economic groups could lead to unequal development and require consideration.
Connected tech
Emerging use cases
Challenges to navigate

Shortages
Demand has led to periodic GPU shortages, leaving businesses to wait.

Energy consumption and sustainability
More computing power means more data centers, which are energy intensive. Now responsible for an estimated 3% of the world's energy use, data centers could account for 4% by the decade's end.xliv Green computing and data center practices, which stress circularity and efficiency in energy use and materials, will be necessary to counteract this demand.

Hype and excessive optimism
New technologies like quantum may take longer to develop than expected given the hype surrounding them. Unrealistic expectations could lead to disillusionment and potentially less funding for research and development.

Thermal management
Dealing with the heat generated by high-performance computing so it does not impact performance and longevity is a persistent challenge.
Plausible futures

Bear Case
Resource scarcity leads to slower development of new software capabilities.

Bull Case
Advances in green computing enable development of innovative, sustainable applications.
Plausible futures

Bear Case
Incompatibility and competing standards hinder interoperability and the widespread adoption of tokenization.

Bull Case
Tokenizing loyalty and digital identity leads to global acceptance, driving innovative applications.
Innovations in data technology are helping enterprises build their brands, seize competitive advantage and strengthen data security and consumer privacy. This section addresses the tokenization of data to expand its utility, the use of advanced analytics to unearth game-changing business insights and emerging methods to protect data.
In data security, tokenization is a means of safeguarding sensitive information, such as a credit card number, by replacing it with a unique, random code (called a token). But tokenization can also enable digital representations of assets — real estate, stocks, intellectual property and other sources of value — on a blockchain or network to make it transferable, tradeable and secure. This technology is being applied to various data types, from biometric attributes for identity authentication to customer reward points for easier redemption, paving the way for new data interoperability and programmability standards.
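Here is a minimal sketch of the data security flavor of tokenization, assuming a simple in-memory vault (the class and method names are ours): sensitive values are swapped for random tokens, and only the vault can map tokens back.

```python
# Vault-style tokenization sketch. Real systems add encryption, access
# control and audited storage around the vault; this shows only the concept.
import secrets

class TokenVault:
    def __init__(self):
        self._store: dict[str, str] = {}

    def tokenize(self, sensitive: str) -> str:
        token = "tok_" + secrets.token_hex(8)  # random, carries no information
        self._store[token] = sensitive
        return token

    def detokenize(self, token: str) -> str:
        return self._store[token]              # privileged operation

vault = TokenVault()
token = vault.tokenize("5555 5555 5555 4444")  # illustrative card-style value
# Downstream systems see and store only the token...
assert token.startswith("tok_")
# ...and only the vault can resolve it back to the original value.
assert vault.detokenize(token) == "5555 5555 5555 4444"
```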
Power surge
Emerging use cases
Challenges to navigate

Regulatory compliance
Navigating the complex and evolving regulatory landscape surrounding digital assets and tokenization is a significant challenge. Jurisdictions have varying regulations regarding using and trading tokenized assets, impacting their adoption.
Security risks
While tokenization enhances security, it's not immune to cyber threats. Smart contract vulnerabilities, blockchain platform security breaches and the risk of token theft remain concerns.
Public perception
Tokenization and digital assets are complex matters that can be misunderstood. A lack of understanding by users can hinder adoption and trust.
Data privacy
There are concerns about how tokenized personal information is used and stored and who has access to it, especially in the case of biometrics.
Companies have always tried to glean meaning from data. But now, with magnitudes more data available to them, they’re leveraging advanced analytics, machine learning and AI to extract deeper insights, improve decision making and transform the business landscape. Organizations with rich data of their own can combine it with other datasets to produce new knowledge. Gen AI is further improving data analysis by allowing businesses to analyze previously inaccessible unstructured data, including social media posts and multimedia content. This capability could lead to a paradigm shift in data mining and offer a competitive advantage to organizations with large data repositories.
Tokenized value
Plausible futures

Bear Case
Slow adoption beyond niche users (like gamers). A precedent would be the metaverse, whose adoption has been slow due to cumbersome hardware and other factors.

Bull Case
Breakthroughs in the design and adoption of headsets and other hardware lead to widespread use in retail and entertainment, transforming those sectors.
Plausible futures

Bear Case
Data quality and accessibility challenges impede insightful analytics.

Bull Case
Advancements in technology and infrastructure usher in a new era of data-driven enterprise insights.
Challenges to navigate

Data quality and integrity
Data subject to analysis must be accurate, complete and reliable. Poor quality data can lead to misleading conclusions and flawed decision-making.

Complexity of data integration
Assimilating data from various sources and formats and ensuring that it is harmonized and consistent for analysis can be complex and resource-intensive.

Responsible use of data
Responsible data collection and utilization practices are key, especially when it comes to personal data used for predictive analytical purposes.

Implementation and maintenance costs
Establishing and maintaining an adequate data analytics infrastructure can be costly, especially for small to medium-sized enterprises.

Skill gaps and talent shortages
In some markets, there could be a shortage of skilled professionals in data science, analytics and AI, which can hinder the development and operation of insight factories.
Emerging use cases
In the digital age, the emphasis on responsible data management is reshaping enterprise strategies. Companies increasingly recognize the importance of data integrity, protection and privacy, particularly in light of sophisticated data breaches and reported misuse of data. Organizations that want to earn consumer trust will not only comply with regulations but work to anticipate changes in the broader data responsibility environment before they happen.
Insight factories
Challenges to navigate

Sophisticated threats
Developing effective security measures to protect against advanced malware, ransomware and phishing attacks is increasingly challenging as bad actors improve their capacities.

Global data regulation compliance
Complying with international data-related regulations (like the GDPR, the CCPA and the EU AI Act) is complex for both multinational corporations that work across jurisdictions and for small and medium-sized companies with limited resources and in-house expertise. Non-compliance can lead to hefty fines and reputational damage.

Balancing accessibility and security
Data must be kept secure, but it must also remain accessible to authorized personnel. Overly restrictive data policies can hinder operational efficiency and innovation, but insufficient ones could make breaches possible.

Rising security and compliance costs
The financial burden of maintaining state-of-the-art data security measures can be significant, particularly for smaller organizations.

Keeping pace with technology
As technology evolves, ensuring that data protection and management measures are up-to-date and can handle new data and storage methods requires constant effort.