From Private Clouds to Edge AI: An Expert IT Architect Shares Insights on His Landmark Projects and Emerging Trends
By Swara Bajaj
Opinions expressed by Entrepreneur contributors are their own.
You're reading Entrepreneur India, an international franchise of Entrepreneur Media.
Corporations are reassessing their cloud strategies. According to Barclays, the percentage of executives planning to shift workloads from public to private clouds rose from 43 per cent in the second half of 2020 to 83 per cent in the first half of 2024. These figures were highlighted at the recent VMware Explore event.
One possible reason noted during the event is that the trend toward public cloud migration has led to "post-traumatic stress syndrome" among corporate IT specialists. They are dealing with the "three Cs" — cost, complexity, and compliance with regulatory requirements.
Anton Davidovskii, a solution architect at Broadcom Inc. with over 20 years of experience, explained the advantages and disadvantages of public clouds. He is also one of the most highly certified IT experts, being one of only 50 professionals in the world to hold two VMware Certified Design Expert certifications along with a Cisco Certified DevNet Expert certification. Davidovskii spoke about the specifics of an IT architect's work, shared his experience implementing his largest projects, and named an important trend in IT today.
IT Architect — A Bridge Between Technology and Business
Every year, IT infrastructure becomes more complex due to several factors: the exponential growth of data volume, the shift from local solutions to cloud and hybrid environments, the spread of connected devices, and increased cybersecurity demands, among others. As a result, companies must coordinate multiple systems and platforms, seamlessly integrate legacy products with new technologies, and ensure timely updates.
This complexity is impossible without IT architects—specialists skilled in designing intricate, distributed IT infrastructures from hardware and software components. Their expertise is relevant across various industries. In finance, for example, they can create large-scale transaction and risk management systems, while in logistics, they can develop systems for tracking the movement of goods.
The main task of an architect is to transform the client's disparate ideas into a comprehensive and effective technical solution. They choose specific technologies and vendors, develop testing methodologies, and document processes. Essentially, IT architects bridge technology and business. They need technical expertise, analytical thinking, and communication skills to ensure that the IT infrastructure is stable, reliable, and aligned with the client's goals.
However, company protocols, requests, and constraints (financial, technical, regulatory) vary from project to project. Therefore, universal, packaged solutions don't exist — architects always work on the cutting edge of innovation.
From Energy Projects to 5G Network Virtualization
No university in the world offers a degree in IT architecture; the field is mostly staffed by computer science specialists. Anton Davidovskii followed this path: he was educated in a computer science specialty, and after university he joined a research institute serving the energy industry. There he gained his first experience implementing new solutions and supporting existing ones.
An important milestone for Davidovskii was working for a major Russian integrator as a system architect. The position involved automating systems worth more than USD 100 million. Tasks ranged from gathering customer requirements to designing and implementing IT solutions, as well as coordinating the activities of a team of 10 to 30 people.
Finally, in 2018, Anton was invited to join VMware Inc. The company develops virtualization software and offers a range of solutions for optimizing cloud infrastructure. In 2023, it was bought by Broadcom Inc. During the years of work in the organization as an architect, Davidovskii implemented a variety of projects. His innovative solutions have been implemented by major technology companies.
In particular, he participated in the creation of a module as part of the VMware Telco Cloud platform. The module is used to manage telecom operator systems that run both in data centers and remote locations.
Many operators today face the problem of managing equipment on telecom towers. Due to their large number and remoteness, even installing an operating system on the servers becomes a difficult task. Previously, companies made do with vertically integrated solutions from specific vendors, but recently the multivendor model has become widespread. In this paradigm, there is no single management system for the various components.
Davidovskii, with the help of his team, created a system that automates the life cycle of a large number of geographically distributed servers: initial installation and configuration, upgrades, replacement of failed equipment, and so on. The team selected and refined the tools and developed standard automation scenarios. The VMware product is now used by many large telecom operators.
Anton also had the opportunity to work with one of the largest banks in Russia and Eastern Europe to build a new private cloud based on the VMware Cloud Foundation stack. More than ten thousand virtual services and machines were migrated to it to improve performance, capacity utilization, and transparency. Remote access infrastructure was deployed for 30,000 users across all of the bank's branches (more than 90 in different regions).
Another example is the project for US cable and satellite TV operator DISH, which created a virtualized autonomous 5G broadband network. For DISH, Davidovskii and his team delivered a universal, scalable platform that allowed the client to launch new services (including private 5G networks and corporate B2B applications) promptly, without unnecessary effort and cost. Today, the DISH 5G Open RAN network covers more than 70% of the US population.
Teamwork: The Matrix Structure in Action
Project teams, Anton explains, typically follow a classic matrix structure. Engineers specializing in virtualization, data storage, computing, and other areas are responsible for the development and implementation of solutions. Working alongside them, architects provide a broader vision of the IT infrastructure and may also have their own technical specializations, such as building communication systems for operators. Additionally, project managers play a crucial role by organizing client meetings, gathering requirements, and facilitating approvals.
At the project's start, a responsible manager and an architect are appointed to establish vertical and horizontal communication among different engineering groups. This is essential, as teams may design solutions based on their own interpretations, potentially missing the latest requirements and developments in other technological branches.
On the client side, many stakeholders and departments are often involved, sometimes totaling up to 20 individuals, ranging from senior business managers to technical specialists. For example, an enterprise may have its architectural department overseeing diverse tasks, such as designing business processes for a multinational corporation and planning a network for a specific data center. Managing relationships with these stakeholders is also a key part of the job.
Secrets of Effective Communication
The first critical phase of a project is gathering requirements, which can range from reducing IT system costs to improving specific parameters like fault tolerance. Sometimes clients provide clear definitions, while at other times they simply request that the solution "work well." In any case, it is the IT architect's responsibility to gather and analyze these requests, taking into account all possible constraints and risks.
During this stage, it's vital to identify conflicting requirements from different departments and top managers, each with their own goals and criteria. Anton shares an example: the information security department always advocates for maximum restrictions, even if it complicates user experience—jokingly referred to as needing to "look at the sun to complete the login." In contrast, the business development team has entirely different priorities.
Finding compromises and mediating between departments is necessary. The primary goal is to help the client acknowledge that their requirements and expectations align with their actual needs. Often, clients may not fully understand their own requirements. For instance, an organization might have one department responsible for servers and another for data systems. When implementing converged storage technology—where servers also act as data storage nodes—responsibilities would need to be reassigned.
Another rule is always to invite someone to take notes during client meetings. Documenting agreements and confirming them with stakeholders afterward helps prevent conflicts. If any points remain unclear, it's better to hold additional discussions to clarify them. Finally, flexibility is key: in some cases it's worth accommodating the client, while in others it's important to remind them that decisions were made months ago and can't be changed. Each project requires an individualized approach to navigate these issues effectively.
From Core to Edge: The Demand for Edge Computing
Anton Davidovskii emphasizes that edge computing is currently in high demand among enterprises. This distributed computing paradigm involves hardware components spread across numerous locations—such as cell towers or tractors in fields—rather than being concentrated in a few large data centers. As a result, some computing and processing occur directly on devices like computers, cameras, and sensors, rather than relying solely on centralized clouds and data centers.
Edge computing allows for faster data processing, reduced latency, lower network load, and improved application responsiveness. Moreover, devices can continue functioning even without network access. A prime example of edge computing in the enterprise sector is found in factory systems that collect and process equipment data, helping to prevent accidents and increase production efficiency.
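The factory pattern described above can be sketched in a few lines. This is a hypothetical illustration (the function name and thresholds are invented, not from any Broadcom or VMware product): an edge node summarizes raw sensor readings locally and forwards only a compact summary plus outliers, rather than streaming every reading to a central cloud.

```python
from statistics import mean, stdev

def summarize_on_edge(readings, threshold=3.0):
    """Process raw sensor readings locally and return only what is
    worth sending upstream: a compact summary plus any anomalies.
    Illustrative example: a vibration sensor on factory equipment."""
    mu = mean(readings)
    sigma = stdev(readings)
    # Flag readings more than `threshold` standard deviations from the mean.
    anomalies = [r for r in readings if sigma and abs(r - mu) / sigma > threshold]
    # Instead of shipping every raw reading to the cloud, transmit
    # a summary and the outliers only, cutting network load.
    return {"count": len(readings), "mean": mu, "anomalies": anomalies}

# 1,000 normal readings plus one spike the edge node should catch.
data = [10.0 + 0.1 * (i % 7) for i in range(1000)] + [50.0]
payload = summarize_on_edge(data)
print(payload["count"], payload["anomalies"])  # 1001 [50.0]
```

Because the detection runs on the device itself, the alert fires even if the link to the data center is down, which is exactly the resilience benefit Davidovskii describes.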
The demand for this technology is growing, with Statista predicting that the global market will reach $350 billion by 2027. However, there is currently a shortage of mature solutions for large enterprises. In response, Davidovskii and his team are developing a product designed to enable flexible, automated management of a large, distributed fleet of devices with minimal involvement of highly skilled personnel.
The expert points out that several challenges must be addressed in edge computing, including ensuring network connectivity, preventing device theft in remote locations, securely storing critical data, and updating device information when necessary. His team is producing prototypes of standardized solutions tailored to specific industries or tasks. These solutions could range from a router-like device to an industrial-grade computer designed to withstand external factors. Once powered on, the device will launch the necessary applications for its intended scenario.
Another trend in edge computing, Anton adds, is edge AI. This approach utilizes pre-trained AI models for specific tasks that perform computations directly on edge devices.
Training and fine-tuning AI models typically require massive computational resources; a single training run of a model like GPT-3 is estimated to have cost millions of US dollars in GPU compute. However, once trained, these models can run inference on relatively modest hardware with only a slight drop in quality, which is the essence of edge AI.
One of the key advantages of edge AI is its ability to respond to events in real-time, which is vital for applications that require instant reactions, such as autonomous vehicles, video surveillance systems, and industrial robots. Furthermore, edge AI optimizes resource usage by reducing the need to transmit large volumes of data to the cloud.
Migration Between Private and Public Clouds
Commenting on the trend of enterprises moving from public to private clouds, Mr. Davidovskii notes that public clouds offer significant advantages. Companies can avoid investing in hardware, scale resources according to business needs, and access data from anywhere in the world. Additionally, providers like AWS, Microsoft Azure, and Google Cloud regularly update their platforms, offering the latest technologies and tools without complex setups.
However, these advantages come with a downside, Anton states. Public clouds may appear cost-effective at first glance, but, as the venture fund a16z argued in its widely cited article "The Cost of Cloud, a Trillion Dollar Paradox," the economics can turn against heavy users. Companies often struggle to manage cloud resources efficiently, leading them to purchase more capacity than necessary. Moreover, providers may charge extra for backup storage, data transfers between regions, and even support services.
Another issue is data egress: moving data out of the cloud is often more complicated and costly than moving it in, creating a form of vendor lock-in for enterprises. Additionally, regulatory compliance is a significant concern; private clouds can be easier to configure to meet strict industry standards, such as HIPAA for healthcare or GDPR for data protection in the EU.
Vendor lock-in can also manifest itself in different ways. Major vendors offer a range of useful tools—from infrastructure products for computing and data storage to innovative solutions in AI, analytics, and IoT, as well as basic messaging services. This means that enterprises are not just using abstract resources and virtual machines; they are tied to specific products, making it challenging to switch providers.
Davidovskii points out that he has observed both transitions from public to private clouds and vice versa in his work. Thus, he believes the future likely lies in hybrid models, where companies combine public virtual resources with their own infrastructure.