The next decade is likely to be defined by an unusual combination: rapid innovation that promises real productivity gains, paired with new forms of fragility that will test corporate strategy, regulatory capacity and cyber defenses.

The technological base of economic activity is shifting in ways that feel structural rather than cyclical. Intelligent systems – combining advanced robotics, AI and data-rich operations – are beginning to run factories, hospitals, logistics chains and financial networks. Done well, this could unlock substantial efficiency and capacity. Done badly, it could hardwire complexity into critical operations, creating ecosystems that are harder to govern, harder to secure and more sensitive to geopolitics. Investors and corporate leaders are therefore asking the right question: not simply what is possible, but how quickly these shifts will arrive, in what sequence and with what second-order consequences for competition and value creation.

The firms that thrive through 2035 are unlikely to be those that treat this as a frantic grab for growth. The winners will treat the next decade as a long, disciplined campaign: careful capital allocation, hard-nosed operational learning and a sober understanding that the new upside comes with new tail risks.

From fixed tools to general-purpose labor

Robotics used to mean single-purpose machines, bolted to the floor, repeating a task for years with minimal variation. That world is fading. More capable hardware, fused with AI – which is expected to match and then exceed human-level performance on problem-solving benchmarks by the end of 2028, if not before – is enabling a new class of polyfunctional robots that can switch tools and tasks without being redesigned from scratch.

The growth projections tell their own story. Forecasts suggest robotics shipments will rise at around 15% a year leading up to 2030. By 2028, leading platforms are expected to handle ten or more distinct tasks. By 2035, that figure could exceed fifty. One machine might weld, inspect welds, move parts, tighten bolts, check torque, monitor wear and carry out basic repairs, then improve over time via software updates delivered over the air. In effect, robots begin to resemble a form of general-purpose labor: not a single tool, but a flexible worker with a growing repertoire.

This shift will not be confined to manufacturing. In logistics, robots will increasingly pick, sort, restock and inspect without halting operations, turning warehouses into near-continuous systems that can run around the clock. On construction sites, mobile platforms will take on surveying and routine repairs. In agriculture, field units will plant, harvest and control pests with a precision that reduces waste and cuts chemical use.

The economics are already visible in places where downtime is punishingly expensive. In automotive plants, unplanned line stoppages can destroy roughly $2.3m of output per hour. Intelligent machines that combine predictive maintenance with physical intervention change that calculus. Fewer shocks, shorter outages and a faster payback period on automation spend can shift boardroom attitudes from “interesting” to “inevitable,” especially when labor constraints and quality targets tighten simultaneously.
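As a rough illustration of that calculus, the sketch below works through a payback calculation using the downtime cost cited above; every other figure (capital spend, operating cost, baseline downtime, the reduction achieved) is an illustrative assumption to be replaced with plant-specific data, not a number from this report.

```python
# Rough payback sketch for an automation investment, using the downtime cost
# cited above ($2.3m per hour of unplanned stoppage). All other figures are
# illustrative assumptions, not data from the report.

DOWNTIME_COST_PER_HOUR = 2_300_000   # USD per hour of unplanned line stoppage (as cited)

capex = 12_000_000                   # assumed: robots, integration, commissioning
annual_opex = 1_500_000              # assumed: maintenance, software, support

baseline_downtime_hours = 10         # assumed unplanned downtime hours per year
downtime_reduction = 0.30            # assumed effect of predictive maintenance + intervention

annual_avoided_loss = baseline_downtime_hours * downtime_reduction * DOWNTIME_COST_PER_HOUR
net_annual_benefit = annual_avoided_loss - annual_opex
payback_years = capex / net_annual_benefit if net_annual_benefit > 0 else float("inf")

print(f"Avoided downtime losses per year: ${annual_avoided_loss:,.0f}")
print(f"Net annual benefit:               ${net_annual_benefit:,.0f}")
print(f"Payback period:                   {payback_years:.1f} years")
```

With these placeholder inputs the payback lands at roughly two years; the point is not the specific answer but that the downtime term dominates, which is why the economics turn decisively where stoppages are most expensive.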

Services will not be spared the same forces. In one Foxconn trial, a collaborative nursing robot cut non-clinical workloads by roughly one-third by ferrying supplies, monitoring patients and handling documentation. Set that against a projected global nursing shortfall of 4.5 million by 2030 and the attraction is obvious. Hospitals gain capacity without equivalent increases in headcount, while nurses regain time for direct patient care. Similar patterns are likely across hospitality, retail and facilities management, where routine tasks absorb a large share of the labor bill and where incremental automation can translate into measurable margin expansion.

For investors and M&A dealmakers, the implication is not just that automation markets will grow. It is that operating models will change. When robots become adaptable and software-defined, the boundary between capex and opex blurs, switching costs fall, procurement shifts toward platforms and the returns accrue disproportionately to firms that can integrate, train and iterate quickly. That, in turn, reshapes what “quality” looks like in diligence: the prize is not simply a robot footprint, but proof of repeatable deployment, reliable uptime and defensible data advantages.

Quantum: threat and opportunity in the same package

As digital infrastructure becomes central to commercial and public services, cybersecurity shifts from an IT concern to a strategic one. Machines capable of breaking widely used public-key schemes are expected within the next decade, making the window for replacing vulnerable systems uncomfortably short. Timing matters: the day a scheme is breakable is not the day the damage begins. The real risk is “harvest now, decrypt later,” where sensitive data is stolen today and cracked when quantum capability matures.
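One common way to reason about this timing is Mosca's inequality: if the years data must stay confidential plus the years a migration will take exceed the years until a cryptographically relevant quantum computer arrives, data harvested today is already exposed. The sketch below applies that test to a few hypothetical channels; all the numbers are assumptions for illustration only.

```python
# "Harvest now, decrypt later" exposure check, in the spirit of Mosca's
# inequality: if shelf_life + migration_time > time_to_quantum, data captured
# today can still be read while it matters. All inputs below are assumptions.

def exposure_gap(shelf_life_years: float,
                 migration_years: float,
                 years_to_quantum: float) -> float:
    """Positive result = number of years harvested data remains exposed."""
    return (shelf_life_years + migration_years) - years_to_quantum

channels = {
    "payment traffic":         exposure_gap(shelf_life_years=7,  migration_years=5, years_to_quantum=10),
    "M&A deal documents":      exposure_gap(shelf_life_years=15, migration_years=5, years_to_quantum=10),
    "public press releases":   exposure_gap(shelf_life_years=0,  migration_years=5, years_to_quantum=10),
}

for name, gap in sorted(channels.items(), key=lambda kv: kv[1], reverse=True):
    status = "AT RISK" if gap > 0 else "ok"
    print(f"{name:24s} exposure gap: {gap:+.0f} years  [{status}]")
```

The uncomfortable result is that long-lived secrets can already be at risk even if a working quantum machine is a decade away, because migration itself consumes years of that window.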

Innovation is gathering pace in response. The most advanced answer is quantum key distribution, which uses properties of quantum physics to create communication links where any attempt to eavesdrop becomes detectable. Quantum-safe cryptography is likely to become the norm for critical communications by 2035, reinforced by regulators that insist on quantum-resistant measures for sensitive data and infrastructure.

For transaction-heavy businesses such as banks and exchanges, which process trillions of dollars in flows each day, this transition has two distinct dimensions. First, they must upgrade networks and software to preserve confidentiality and integrity as quantum capabilities spread. Second, they can use quantum optimization models to fine-tune supply chains, routing and inventory management in ways that classical systems struggle to match. Even as AI hype dominates headlines, the surrounding cyber and computational infrastructure is evolving quickly and quietly. Business decision makers cannot afford to watch only the most visible wave while ignoring the undercurrents that determine resilience, compliance and cost.

For corporate strategists, this duality should influence both risk management and opportunity capture. Quantum preparedness will increasingly feature in buyer scrutiny, regulatory engagement and insurance conversations. At the same time, early advantages may accrue to firms that can translate quantum experimentation into practical optimization gains, especially in networked industries where small improvements compound across the system.

Invisible constraints: power, water and geography

The vision of a robot-run, AI-forward future – with machines tilling fields and algorithms orchestrating the internet – assumes abundant computing capacity. Reality is less generous. Data centers demand large volumes of electricity and water for cooling. In several regions, power grids and local water resources are already limiting how quickly new facilities can be built and how intensively they can be operated. For technology providers, compute capacity becomes a scarce input that must be rationed between training huge models and running customer workloads. Scarcity, in turn, shapes pricing power, customer experience and the economics of scaling.

At the same time, trade policy and geopolitical rivalry are reshaping where digital infrastructure and advanced manufacturing can sit. Export controls on cutting-edge chips, restrictions on foreign ownership of sensitive assets and data localization rules are pushing companies toward more regionalized technology footprints. The practical consequence is that “global” increasingly means “federated” rather than “centralized,” with duplication of systems, complexity in governance and new dependencies on local partners and policymakers.

Boards and investors who treat infrastructure and policy as side issues risk mispricing both upside and downside. As 2035 approaches, the value of a technology asset will depend more on physical location, regulatory exposure and access to stable power and water – not just the elegance of its code. The next generation of moats may look less like patents and more like permits, grid access and trusted operating relationships in the right jurisdictions.

Disciplined adoption, not blind acceleration

How should tech-forward firms respond today? The safest route through a volatile transition is structured experimentation. Pilot projects should measure not only output and error rates, but total cost of ownership: maintenance, software development, cloud usage and workforce training. Only when the economics are clearly attractive should wider rollout follow. This is not conservatism. It is the fastest path to scalable learning without turning the core business into a live beta test.
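A minimal sketch of that discipline is shown below: a total-cost-of-ownership check covering the categories named above (maintenance, software development, cloud usage and workforce training) against measured pilot benefits. Every figure is a hypothetical placeholder to be replaced with data from the pilot itself.

```python
# Pilot-stage total cost of ownership sketch, covering the categories named
# above (maintenance, software development, cloud usage, workforce training).
# Every figure is an illustrative placeholder to be replaced with pilot data.

from dataclasses import dataclass

@dataclass
class PilotEconomics:
    hardware_capex: float
    annual_maintenance: float
    annual_software_dev: float
    annual_cloud: float
    annual_training: float
    annual_benefit: float          # measured productivity / quality gains
    horizon_years: int = 5

    def total_cost(self) -> float:
        annual = (self.annual_maintenance + self.annual_software_dev
                  + self.annual_cloud + self.annual_training)
        return self.hardware_capex + annual * self.horizon_years

    def net_value(self) -> float:
        return self.annual_benefit * self.horizon_years - self.total_cost()

pilot = PilotEconomics(hardware_capex=800_000, annual_maintenance=60_000,
                       annual_software_dev=150_000, annual_cloud=40_000,
                       annual_training=50_000, annual_benefit=450_000)

print(f"5-year TCO:       ${pilot.total_cost():,.0f}")
print(f"5-year net value: ${pilot.net_value():,.0f}")
print("Scale up" if pilot.net_value() > 0 else "Keep iterating at pilot scale")
```

With these placeholder inputs the verdict is "keep iterating": a pilot that looks productive on output alone can still be value-destructive once the full cost stack is counted, which is precisely the case for measuring before scaling.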

Organizations with meaningful automation exposure should also stress-test human-machine workflows regularly. The goal is to surface single points of failure and quantify the business impact of breakdown scenarios, so that each increment of automation increases resilience rather than quietly eroding it. Complex systems fail in complex ways, often at the seams: handoffs between teams, edge cases in data, exceptions in operations. Stress testing is how firms make those seams visible before they rip.

On cybersecurity, the move to quantum-safe infrastructure must begin well before quantum machines can threaten current schemes in practice. Early steps include cataloguing sensitive data and communication channels, then segmenting networks so breaches can be contained rather than cascading. Priority should go to financial transactions, industrial control systems and channels carrying intellectual property or strategic information. The logic is straightforward: protect the flows that, if compromised, would either stop the business or permanently damage its competitive position.
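One simple way to turn that logic into a migration order is to score each channel by the damage a compromise would do and by how long its data must stay secret. The sketch below is purely illustrative: the categories echo those named above, but the weights and scores are assumptions, not a recommended methodology.

```python
# Illustrative migration-priority scoring for the channel categories named
# above. Weights and scores are assumptions; the point is to rank work by
# the damage a compromise would do and how long the data must stay secret.

channels = [
    # (name, business_impact 1-5, data_lifetime_years)
    ("financial transactions",        5, 7),
    ("industrial control systems",    5, 3),
    ("IP / strategic communications", 4, 15),
    ("routine internal email",        2, 2),
]

def priority(impact: int, lifetime_years: int) -> float:
    # Simple weighted score: long-lived, high-impact data migrates first.
    return impact * 2 + min(lifetime_years, 20) * 0.5

for name, impact, lifetime in sorted(channels, key=lambda c: priority(c[1], c[2]), reverse=True):
    print(f"{priority(impact, lifetime):5.1f}  {name}")
```

However the weights are set, the ranking should keep the flows that would stop the business or permanently damage its competitive position at the top of the queue.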

For investors, M&A dealmakers and corporate strategists, the overarching investment landscape will reward a particular kind of company. The most attractive businesses will demonstrate credible routes to profitability rather than relying on stories of endless expansion. Their technology plans will match realistic adoption timelines rather than marketing calendars. They will treat workforce development and cyber resilience as central management tasks, not compliance chores. In demanding enterprise settings, robust technological strategy is not a nice-to-have. It is the price of admission. The leaders of tomorrow are already building it and executing it.

Discover further insights

To learn more, download The Future of Tech, 2025–2035: Insights for Investors & Dealmakers, published in association with Sterling Technology – the provider of premium virtual data room solutions for secure sharing of content and collaboration for the investment banking, private equity, corporate development, capital markets and legal communities engaged in TMT M&A dealmaking and capital raising.