Early decisions have a long-lasting impact across the entire product lifecycle
In the early stages of product development planning, your upper management or engineering lead will inevitably face the difficult decision of choosing a suitable technology stack.
Working with many customers and teams from across the globe, we’ve found one thing in common: people are most productive in the technology of their choice.
Following a technology hype, whether a new language backed by a major player or an industry-specific one that fits the purpose reasonably well, must be a well-planned decision.
Engineering teams enjoy a constant challenge, and learning a new language is often a rewarding endeavour. Mastering a language, however, is a multi-year, if not decades-long, process with growing rewards in the form of increased productivity.
When your team is facing a new, high-stakes project, experimenting with the current technology hype can become a substantial risk to its success. Depending on your stack of choice, the availability of tooling, support resources and skilled engineers can affect the speed at which you innovate.
Each product or service relies on a different set of factors that shape your requirements for a technology stack decision. While every language differs in its specifics, most fall into three core groups.
Interpreted languages are very easy to learn, write and deploy across providers of your choice, with excellent tooling. All of these advantages, however, come at a substantial cost: position in the stack.
Interpreted languages sit at the highest part of the stack, and the code must be continuously evaluated inside an abstraction layer, which incurs substantial resource use and performance penalties.
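To make this concrete, here is a small sketch using CPython (one common interpreted runtime) and its standard `dis` module: the source is translated into bytecode instructions, which the interpreter loop then evaluates one by one every time the function runs, rather than executing native machine code directly.

```python
import dis

def add(a, b):
    return a + b

# CPython compiles the function to bytecode; at runtime the interpreter
# loop dispatches these instructions one at a time, which is the
# abstraction layer that costs performance.
instructions = [ins.opname for ins in dis.get_instructions(add)]
print(instructions)  # instruction names vary between CPython versions
```

The exact instruction names differ between interpreter versions, but the principle holds: every call walks through this instruction stream instead of running pre-compiled machine code.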
Traditional compiled languages remain a highly popular option across projects that value performance and control over fast iteration speeds.
Languages distributed as compiled binaries are considerably harder to develop in and master than their interpreted counterparts, yet mature products and projects can extract substantial advantages from raw performance and the availability of advanced tooling.
Emerging as a merger between the static binaries produced by traditionally compiled languages and the dynamic nature of interpreted languages, Just-in-Time (JIT) compilation attempts to provide the best of both worlds.
In a typical implementation, the language is first compiled to language-specific bytecode, which a virtual machine then compiles to optimized machine code at runtime. JIT compilers offer unique advantages such as real-time optimization based on the execution paths observed while the program runs.
As a disadvantage, JIT languages carry substantial overhead in the form of a virtual machine deployed alongside the application.
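The core JIT idea, interpret first, then compile the hot paths at runtime, can be sketched in a few lines. This is purely a conceptual toy, not how a production VM works; the names `JitExpr` and `HOT_THRESHOLD` are invented for illustration. The expression is re-evaluated from source on every "cold" call, and once it is called often enough, it is compiled once and the compiled form is reused.

```python
HOT_THRESHOLD = 3  # hypothetical cut-off for when a call site is "hot"

class JitExpr:
    """Toy model of a JIT: interpret cold code, compile hot code once."""

    def __init__(self, source):
        self.source = source      # e.g. "x * x + 1"
        self.calls = 0
        self.compiled = None      # filled in once the expression is hot

    def __call__(self, x):
        self.calls += 1
        if self.compiled is None and self.calls >= HOT_THRESHOLD:
            # Runtime compilation step: done once, after observing
            # that this code path is executed repeatedly.
            self.compiled = compile(self.source, "<jit>", "eval")
        if self.compiled is not None:
            return eval(self.compiled, {"x": x})   # fast, pre-compiled path
        return eval(self.source, {"x": x})         # slow, re-parsed path

square_plus_one = JitExpr("x * x + 1")
print([square_plus_one(n) for n in range(5)])  # → [1, 2, 5, 10, 17]
```

A real JIT compiles to machine code rather than bytecode and uses far more sophisticated heuristics, but the trade-off is the same one described above: the virtual machine and its bookkeeping travel with the application.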
At ikigai.net, we believe that selecting one language or group for all purposes is not a wise decision. Infrastructure-level components, APIs, back-ends and front-facing platforms all have vastly different development requirements and iteration cycles.
Take a public-facing online portal as an example: the website changes frequently and requires fast iteration cycles at a performance penalty, while the logic-driven business API values reliability, performance and stability over rapid release cycles.
Some emerging languages support both AOT and JIT compilation, breaking with the standard the industry has defined.
Unsurprisingly, our approach is to find the balance between all of the available options, combined with a deep understanding of the team’s skills.
ikigai.net embraces a new approach for technology consulting by creating the perfect balance between Work, Skills, Passion and Social Impact.