Top Down and Bottom Up

BALANCING TOP-DOWN AND BOTTOM-UP APPROACHES IN SOFTWARE DEVELOPMENT

Since 1978, as president of the Richmond Upon Thames College UK microprocessor society¹, I have advocated an approach to development that necessarily combines top-down and bottom-up thinking at the same time. The top-down approach must be checked fairly quickly against its assumptions. This may mean writing small pieces of code to verify that a particular module, or even a single function, can be implemented and does not conflict with another piece of code.
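One way to check those assumptions quickly is a small, throwaway "spike". The sketch below is purely illustrative (the function name, record count, and time budget are invented figures, not from the text above); it tests whether a design assumption, that a given volume of data can be sorted within a fixed budget, actually holds before the module is committed to.

```python
# A throwaway "spike": before committing to a design that assumes sorting
# records in memory is cheap enough, measure it against the time budget.
# The figures (one million records, a one-second budget) are hypothetical.
import random
import time

def spike_sort_feasibility(n_records: int = 1_000_000,
                           budget_s: float = 1.0) -> bool:
    """Return True if sorting n_records floats fits within budget_s seconds."""
    records = [random.random() for _ in range(n_records)]
    start = time.perf_counter()
    records.sort()
    elapsed = time.perf_counter() - start
    print(f"Sorted {n_records} records in {elapsed:.3f}s (budget {budget_s}s)")
    return elapsed <= budget_s

if __name__ == "__main__":
    spike_sort_feasibility()
```

The spike is deliberately disposable: its only job is to confirm or refute an assumption before real design effort is spent on it.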

A DELIGHTFUL EXPERIENCE

Upon reflection, it has been a delightful experience to witness the diverse and extensive history of the technology sector, from the pioneering days of the microprocessor onwards. My approach combined the strengths of both methodologies, making it flexible and pragmatic. Put simply, I feel that combining top-down and bottom-up approaches is a good way to achieve balance.

ENCAPSULATING THESE IDEAS

Conceptual Clarity: Top-down provides the grand vision, the blueprint, the architecture. It allows for a macro perspective, ensuring that the broader goals and organisational needs are clear and are being addressed. This helps in establishing the “what” and “why” of the project.

Validation & Feasibility: A bottom-up approach, even if it's just for critical or uncertain modules, provides validation. It checks the viability of the top-down concepts, ensuring that the envisioned solutions are technically feasible. This is the “how” aspect.

Mitigates Risks Early: By combining the two, potential challenges or pitfalls can be identified early in the project lifecycle, allowing for adjustments before too much time or too many resources are invested. It can also highlight areas where more research or expertise is needed.

Iterative Feedback: The bottom-up approach provides opportunities for early testing and iterative feedback. This can be invaluable, especially when dealing with novel or highly complex systems. By developing and testing smaller modules or components, you get a clearer understanding of potential issues, performance challenges, or integration concerns.
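To make that bottom-up validation concrete, the sketch below is a hypothetical example (the rate-limiter module and its names are invented for illustration, not taken from the text): a small, risky component is implemented and exercised in isolation, with a check around it, before the surrounding system exists.

```python
# Bottom-up validation: build the smallest risky piece first and test it
# on its own. This token-bucket rate limiter is illustrative only.
import time

class TokenBucket:
    """Minimal token-bucket rate limiter (a sketch, not production code)."""

    def __init__(self, capacity: int, refill_per_s: float):
        self.capacity = capacity
        self.tokens = float(capacity)      # start full
        self.refill_per_s = refill_per_s
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Consume one token if available; refill based on elapsed time."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_s)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Exercise the module in isolation: with no refill, a capacity of two
# should permit exactly two requests and reject the third.
bucket = TokenBucket(capacity=2, refill_per_s=0.0)
print([bucket.allow() for _ in range(3)])  # [True, True, False]
```

Testing the component this way surfaces interface and behaviour questions (what happens at the boundary, how refill interacts with bursts) long before integration.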

Collaboration and Expert Input: While the top-down approach often involves more managerial or architectural roles, the bottom-up approach can provide invaluable insights from those on the front lines of development. Combining these perspectives can lead to a more robust and comprehensive solution.

The dynamic between these approaches could be likened to an architect designing a building (top-down) while regularly consulting engineers and builders (bottom-up) to ensure that the design is both visionary and feasible. The iterative feedback between the two ensures that the end product is both innovative and grounded in reality.

The rise in CPU power has led to much of what I said then being ignored by developers. I know people who were around at the time of that society, people with PhDs, who nevertheless program in C++ or other languages with a kind of “type-as-you-go” (TAYG) mindset. The outcomes are not awe-inspiring and often result in unintelligible code and inadequate solutions. The driving forces behind this are the financial advantages and, as previously stated, the CPU power. We did not have that power: there was not enough of it to implement bad solutions, so it was, and in my opinion still is, highly important to follow a development philosophy that assists in selecting and testing before choosing a path.

THIS CAN BE SUMMARISED IN THE FOLLOWING MANNER

Resource Constraints Leading to Innovation: There's a famous quote that says, “Necessity is the mother of invention.” When resources like CPU power and memory were scarce, developers had to be incredibly efficient and thoughtful in their code. This constraint often led to innovative solutions and highly optimized code. There wasn't room for waste.

Modern Abundance Can Lead to Complacency: With today's powerful hardware, there is often a mentality of “throw more hardware at the problem” rather than optimising the software. This can result in bloated and inefficient code, with the immediate cost of inefficiency hidden by the abundance of resources.

Shift in Development Priorities: The focus in modern software development has typically shifted towards delivering features quickly (time-to-market) rather than ensuring optimal performance. Agile methodologies, while effective in many ways, can sometimes contribute to this if not properly balanced with robust architectural planning.

Readability vs. Efficiency: One argument that has persisted over time is the balance between code readability and efficiency. Some argue that with modern compilers and interpreters, it's better to write code that's readable and maintainable, trusting that the compiler will handle optimization. However, this doesn't negate the need for thoughtful architecture and design.
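As a small illustration of that trade-off (a hypothetical Python sketch, not from the original text), the readable version below delegates to the optimised built-in, while the hand-tuned loop is longer and, in CPython, usually no faster:

```python
# Readability vs. efficiency: prefer the version that states intent and
# trust the runtime's optimised primitives. Names here are illustrative.
def total_readable(values):
    """Readable: intent is obvious; sum() is implemented in C in CPython."""
    return sum(values)

def total_manual(values):
    """Hand-rolled: more code, more places to get it wrong, typically slower."""
    acc = 0
    i = 0
    n = len(values)
    while i < n:
        acc += values[i]
        i += 1
    return acc

data = list(range(1000))
assert total_readable(data) == total_manual(data) == 499500
```

Both compute the same result; the readable one is also the one a maintainer can verify at a glance, which matters more over the software's lifetime.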

Cost of Maintenance: It's well-documented that the majority of software costs come after the initial development, during the maintenance phase. Code that's difficult to read, understand, or extend can lead to significantly higher costs over the software's lifecycle.

The Importance of Fundamentals: No matter how much technology advances, the foundational principles of computer science and software engineering remain relevant. Algorithms, data structures, design patterns, and architectural principles are timeless and should be part of every developer's toolkit. There's certainly a place for rapid prototyping and “type-as-you-go” when exploring new ideas or domains. But when it comes to building robust, maintainable systems, there's no substitute for careful planning, design, and testing. The principles that I have tried to advocate are timeless and apply just as much today as they did in the early days of computing. It's a reminder that while our tools and technologies evolve, the core principles of good software engineering remain constant.

FOOTNOTES


  1. The Richmond Upon Thames College Microprocessor Society was founded in 1978. It was affiliated with Brunel University London, and it also attracted people from other universities, largely due to Dr David Hanley's influence; word spread quickly throughout London. Within months of the society's inception, it had attracted people from many disciplines. At that time, the society was highly influential, and many pivotal ideas were discussed there. These included the possibility of a transport protocol above ARPANET: the idea being that computers would be able to communicate using a common markup, thus achieving what is now known as the internet.