Celebrating & Reflecting on My 30-Year Career in IT

My career in information technology (IT) started around 1993 after completing my master’s degree in computer science. This year marks thirty years spent working in a fascinating and challenging technical field. This article is a peek into my professional journey.

Image generated using Dall-E

While walking outside under a beautiful sunny sky, I reflected on my career and realized that this year marked three decades of hard work in this wonderful world of computers and software. I could barely believe it. Time flies so fast. I vividly remember my last days as a computer science graduate student, looking ahead and trying to imagine what I would do next. It was hard to picture how my duties would evolve, partly because the computer industry was, and still is, a fast-moving field. The pace of innovation and transformation hasn't slowed over all these years.

Throughout those three decades, my interests have always been directed toward the infrastructure side of information technology. As passionate about programming as I was during my university years, I wasn't a great programmer. A good one, maybe, but not much more. My career reflects that, and it was shaped primarily by the new technologies that popped up along the way. This is why I wish to reflect on my career by dividing the journey into four phases, each marked by the introduction of some game-changing technologies.

Pre-virtualized IT world, client-server apps, local area networks (1993-2003)

Inside a computer room, circa 2010

Before elaborating on my role in the first segment of my career, let's define the client-server application, the pervasive computing paradigm of the day. This definition sets the proper context for the work I did.

In the 1990s, the client-server application architecture was widely prevalent. It is a distributed computing model where application processing is divided between two types of computers: clients and servers.

In this architecture, the client is a computer or device that requests and consumes services or resources from the server. On the other hand, the server is a powerful computer or device that provides services or resources to clients upon request.

Key characteristics of the client-server architecture in the 1990s include:

  1. Client-side processing: Clients are responsible for presenting the user interface and handling user interactions. They perform minimal or no processing of data or business logic.
  2. Server-side processing: Servers handle the core processing tasks, such as data storage, business logic, and computational operations. They respond to client requests, process data, and return the results.
  3. Communication through a network: Clients and servers communicate with each other over a network using protocols like TCP/IP. Clients send requests to servers, and servers respond with the requested data or perform the requested actions.
  4. Centralized data storage: In many client-server architectures of the 1990s, databases and data storage were typically centralized on the server side. Clients would send queries or requests for data to the server, which would retrieve and return the data.

Overall, the client-server architecture of the 1990s emphasized centralized control, with servers acting as the backbone of the system, handling the majority of processing tasks and data storage, while clients focused on user interface and interaction.
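
To make the request-response pattern above concrete, here is a minimal sketch in Python of a toy client-server exchange over TCP/IP. The port number, the tiny "inventory" standing in for centralized data, and the one-word protocol are all illustrative assumptions, not a reconstruction of any system I actually worked on.

```python
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 9090  # illustrative values only

def server() -> None:
    """Server side: owns the data and the business logic, answers client requests."""
    inventory = {"widget": 42, "gadget": 7}  # stands in for centralized data storage
    with socket.create_server((HOST, PORT)) as srv:
        conn, _ = srv.accept()
        with conn:
            item = conn.recv(1024).decode().strip()           # request from the client
            reply = str(inventory.get(item, "unknown item"))  # server-side processing
            conn.sendall(reply.encode())                      # response over TCP/IP

def client(item: str) -> None:
    """Client side: gathers input, sends a request, displays the result."""
    with socket.create_connection((HOST, PORT)) as sock:
        sock.sendall(item.encode())
        print(f"{item}: {sock.recv(1024).decode()}")

if __name__ == "__main__":
    threading.Thread(target=server, daemon=True).start()
    time.sleep(0.2)     # give the server a moment to start listening
    client("widget")    # prints "widget: 42"
```

In the two-tier setups of that era, the server half would typically have been a database or application server on the local area network, and the client half a desktop application on someone's PC.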

Me inside a computer room, circa 2005

The backbone of this landscape was distributed data centers and open-space offices where working from home was the exception. Countless trips in and out of the computer rooms to install something new, fix something else, or check on this or that formed the backdrop of my daily work. Because everything was physical, nothing was easy, and every change required heavy change management. If we ordered a new server with the wrong configuration, we were stuck with it for many years. We were proud of our data centers, but they came with a lack of flexibility and agility. Moving a data center was a significant undertaking.

It was also an era of physical consolidation: small and remote computer rooms were phased out and migrated into larger buildings with dedicated space acting as data centers. Wide area networking benefited from higher-speed links, which enabled this centralization. In the next segment, you'll see that centralization was a necessary step toward the next wave: virtualization at a massive scale.

The virtualization of everything (2003-2015)

Server virtualization gained popularity in the early 2000s primarily due to advancements in hardware technology, the need for better resource utilization, and cost-saving opportunities. Here are the key factors that contributed to its popularity:

  1. Hardware Advancements: The increasing power and affordability of processors, memory, and storage allowed servers to handle larger workloads. This made it feasible to partition physical servers into multiple virtual machines (VMs) without sacrificing performance.
  2. Server Consolidation: Many organizations had multiple physical servers that were underutilized, with each server running a specific application or operating system. Server virtualization introduced the concept of consolidating these servers into a single physical machine by running multiple VMs. This efficient use of resources helped reduce operational costs.
  3. Flexibility and Scalability: Virtualization provided the ability to create, manage, and replicate VMs easily. This flexibility allowed IT departments to quickly provision and deploy new servers or scale resources up and down as needed, without the need for physical hardware procurement and installation.
  4. Improved Disaster Recovery: Virtualization offered improved disaster recovery capabilities by enabling VM snapshots, backups, and faster recovery times. In the event of a server failure or disaster, VMs could be quickly restored on different physical servers, minimizing downtime and data loss.
  5. Testing and Development Environment: Virtualization provided a cost-effective way to create isolated testing and development environments. Developers could easily set up VMs to test new software or configurations without affecting the production environment.

Overall, server virtualization proved to be a game-changer in IT infrastructure management, offering improved utilization, scalability, flexibility, and cost savings. These factors contributed to its popularity and widespread adoption in the early 2000s.
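
As a small illustration of what that shift meant day to day, here is a sketch of programmatic VM management using the libvirt Python bindings. It assumes a libvirt/KVM host reachable at the default qemu:///system URI; the VMware stack I actually worked with had its own tooling and SDK, so treat this purely as an example of the idea, not of my setup.

```python
# A minimal sketch of programmatic VM management on a libvirt/KVM host
# (pip install libvirt-python). It lists the virtual machines on one
# physical server and the CPU and memory allocated to each.
import libvirt

def list_vms(uri: str = "qemu:///system") -> None:
    conn = libvirt.openReadOnly(uri)  # read-only connection to the hypervisor
    try:
        for dom in conn.listAllDomains():
            state, max_mem_kib, _, vcpus, _ = dom.info()
            running = "running" if dom.isActive() else "stopped"
            print(f"{dom.name():20s} {running:8s} "
                  f"{vcpus} vCPU, {max_mem_kib // 1024} MiB RAM")
    finally:
        conn.close()

if __name__ == "__main__":
    list_vms()
```

The point is less the specific API than the fact that an inventory task which once meant walking into a computer room became a few lines of script run from a desk.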

I was a big fan of virtualization, popularized by VMware, now part of Broadcom (sadly). Virtualization helped drop the barriers, making things faster and making more things possible. As servers, storage, and networks slowly became virtualized, old habits had to be dropped as new ones came to take their place. Mostly gone were the ins and outs of the data center computer rooms. Everything could be done remotely, silently, and efficiently. The virtualized world moved faster, at the click of a mouse. It was a great time to work in IT.

During this period of my professional life, I became more involved in infrastructure architecture projects and a bit less in level-1 and level-2 operations and troubleshooting. It was technically fun and challenging. I learned so many concepts and principles that I still use today in my day-to-day work when designing system architectures.

The virtualization of the IT world paved the way for the third era in my professional life: the cloud.

The “cloudification” of the IT landscape (2015-now)

Now, let’s see how virtualization played a major role in enabling the cloudification of the data center.

Virtualization played a crucial role in paving the way for the introduction of cloud computing in companies later on. The key contributions of virtualization in the early 2000s were as follows:

  1. Server Consolidation: Virtualization allowed multiple virtual machines (VMs) to run on a single physical server, effectively consolidating hardware resources. This consolidation resulted in better hardware utilization, reduced data center space requirements, and lower power consumption. By optimizing resource usage, virtualization laid the groundwork for the cost efficiencies that would later be associated with cloud computing.
  2. Isolation and Security: With virtualization, each VM operates in an isolated environment, providing a level of separation and security between different applications and workloads. This isolation helped address concerns related to application compatibility, security breaches, and resource conflicts. The ability to securely isolate workloads paved the way for building multi-tenant architectures in the cloud, where multiple organizations can coexist on shared infrastructure.
  3. Elastic Resource Allocation: Virtualization introduced the concept of dynamically adjusting resources allocated to virtual machines. IT administrators could scale virtual machines up or down based on demand, without the need to add or modify physical hardware. This ability to elastically allocate resources aligned with the scalability requirements of cloud computing, where companies can quickly provision or deprovision resources based on workload and user demands.
  4. Data Center Virtualization: Virtualization extended beyond servers to encompass storage and networking. It enabled the abstraction of storage resources, allowing for efficient management and allocation of storage capacity. Similarly, network virtualization provided the ability to create virtual networks, enabling flexible and isolated network configurations. These advancements in data center virtualization set the stage for the "software-defined" infrastructure approach widely adopted in cloud environments.
  5. Infrastructure Abstraction: Virtualization abstracted the underlying hardware layer, providing an intermediate layer between the operating system and the hardware resources. This abstraction allowed for greater compatibility and flexibility. Applications and workloads could run on virtual machines without being tied to specific hardware configurations. This decoupling of applications from hardware laid the foundation for the portability and mobility that cloud computing later expanded upon.

Virtualization’s early adoption and success demonstrated the benefits of consolidating and abstracting IT resources, optimizing resource usage, and providing flexibility and scalability. This experience created a strong foundation and mindset around leveraging shared resources and on-demand provisioning, which were fundamental principles later embraced by cloud computing models. As a result, the advancements in virtualization technology played a crucial role in shaping the concepts and practices that paved the way for introducing cloud computing in companies.
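
To show how "elastic resource allocation" feels in practice once it reaches the cloud, here is a minimal sketch using boto3, the AWS SDK for Python. The choice of AWS, the AMI ID, the instance type, and the tag are all placeholders of my own, not a description of any client environment.

```python
# A minimal sketch of on-demand provisioning with boto3 (pip install boto3).
# Region, AMI ID, instance type, and tags are placeholder assumptions.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")

# Provision a server in minutes instead of ordering hardware months in advance.
instance = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder image ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "demo-web-01"}],
    }],
)[0]

instance.wait_until_running()
instance.reload()  # refresh attributes such as the public IP address
print(instance.id, instance.public_ip_address)

# Deprovision just as quickly when the workload no longer needs it.
instance.terminate()
```

The same pattern, expressed declaratively rather than imperatively, is what infrastructure-as-code tools such as Terraform generalize across providers.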

I’ve been involved in many projects with a wide diversity of clients, and most of the time, the cloud is part of the technical landscape to some degree. Some clients are moving entirely into the cloud, while others are doing so only partially. I would argue that in the last few years, we’ve seen a more profound transformation of the IT landscape than ever before, at every level of the organization: technology, people, and processes. Everything is changing thanks to the cloud’s foundational principles. Some people see the cloud as a threat and refuse to adopt it, while others fully embrace it. The latter will survive this significant and profound wave of change. I’m trying to surf that wave myself. I have much work to do and many things yet to learn. However, cloudification is here to stay, as organizations will take many more years to adapt and transform themselves. It won’t be done by the time I retire, for sure. But there is one more thing popping up that will keep me interested in and fascinated by IT: AI.

Information technology went from a physically defined world to a software-defined one in a steady and fascinating way.

Augmented Intelligence everywhere

The advent of AI as a consumer-facing technology has triggered many discussions and debates. We’ve been talking about AI for decades. When I was a university student and then a graduate student, fuzzy logic systems, expert systems, and neural networks were the hot topics. But in 2023, AI took a pivotal turn.

AI is a crucial information technology trend sparking significant discussions about ethics and the role of IT in our lives. While IT has always been intertwined with ethical considerations, the advent of AI brings these discussions into sharper focus for several reasons:

  1. Decision-Making Power: AI's ability to make decisions, sometimes without human intervention, raises questions about accountability, transparency, and fairness. The ethical implications of AI-driven decisions, especially in critical areas like healthcare, law enforcement, and employment, are a major concern.
  2. Privacy and Data Security: AI systems often require vast amounts of data, including personal information, to learn and make decisions. This raises significant privacy concerns and highlights the need for robust data security measures to protect sensitive information.
  3. Bias and Discrimination: AI systems can inadvertently perpetuate and amplify biases present in their training data. This can lead to discriminatory outcomes, particularly in areas like hiring, loan approvals, and law enforcement. Addressing AI bias is crucial for ensuring ethical AI deployment.
  4. Job Displacement and the Future of Work: AI and automation bring concerns about job displacement, with AI potentially replacing human workers in various fields. This leads to discussions about the future of work, retraining, and the social responsibilities of organizations deploying AI.
  5. Control and Autonomy: Advanced AI raises concerns about human control over technology. Ensuring that AI systems do not act beyond their intended purpose and that humans retain ultimate control is a key ethical issue.
  6. Long-Term Impacts: The long-term impacts of AI on society, including social structures, human behavior, and our understanding of intelligence, are still unknown. This uncertainty prompts discussions about how to guide AI development responsibly.
  7. Global Regulation and Standards: The global nature of AI technology poses challenges in creating unified standards and regulations. Different cultural, legal, and ethical standards across countries complicate the establishment of universal guidelines for AI development and use.

AI is not just a technological trend; it's a catalyst for the profound ethical and societal transformation of our relationship with technology and computers. AI will shape the role of IT in our lives. Related discussions will help determine the path of AI development and integration into various aspects of society.

AI will be the next frontier, the next pivot in technology that touches every aspect of our lives, both in the IT world and beyond. This wave will be the next iPhone or Netscape moment. Many products and services will be adapted and augmented with AI-based features to help execute tasks. How we use systems in the future will be quite different from how we use them today. I can imagine system documentation still being available as plain text, but also wrapped inside an AI engine that users can prompt to get useful guidance on accomplishing complex tasks. Later, the system will evolve to ask users whether they would like it to perform the task itself.
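
As a hedged sketch of what "prompting the documentation" could look like, here is the idea expressed with the OpenAI Python client. The model name, the documentation file, and the prompt wording are all assumptions for illustration; a production system would add a retrieval step so only the relevant sections are sent to the model.

```python
# A hypothetical sketch of documentation wrapped in an AI engine: the user asks
# a question, the docs are placed in the prompt, and the model answers.
# Model name, file name, and prompt wording are illustrative assumptions.
from openai import OpenAI  # pip install openai

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask_the_docs(question: str, doc_path: str = "admin-guide.txt") -> str:
    with open(doc_path, encoding="utf-8") as f:
        docs = f.read()  # a real system would retrieve only the relevant sections
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": "Answer using only the documentation provided below.\n\n" + docs},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(ask_the_docs("How do I rotate the application logs?"))
```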

Being part of these transitions for the next ten years, which will mark my forty years in IT, promises to be even more exciting. The opportunity to watch another wave of technical transformation is full of potential for learning new things and adapting to new market requirements. Who knows what awaits me in the last decade of my professional life? I’m lucky enough to have witnessed four different technological eras, and I appreciate each wave because they brought meaningful improvements and challenges to my career and to society in general.

Disclaimer: portions of this article were written using prompts and replies with ChatGPT to summarize technological landscapes in specific eras. See guiding principles for AI usage in articles.