The Interact team | March 3, 2021
Rich Kenny is Managing Director at Interact. He is also Group Sustainability and Research Director at Techbuyer, an influencer for the Northern Intelligence forums, and an expert in leading technical, research and innovation projects. We caught up with Rich to ask how he got into the IT industry, what he has been up to, and how Interact is going to help today’s businesses.
I’ve always loved computers and bought my first Unix server from a liquidation auction when I was about 14. I broke it down into components and sold it in pieces to make a decent profit. It was massive, though, and a lot of it sat on our kitchen table in the flat for months. I’d been building PCs from the age of 10 – 386s mainly – and I remember trying to reprogram my school’s only PC when I was 9. That’s where I started. My mum is a business owner and has always worked for herself. She has run a few different businesses and always been successful, so I learned from an early age what it takes to run and manage a business. For me personally it has been an odd journey, but it started properly when I was at uni studying International Development. I won an internship at Adam Smith International, took an intercalated year to go and work with them, and got a taste for large-scale project management. I started working in digital and technology and built a career in IT from there.
At Techbuyer I’m the Group IT Director. I look after a wonderfully diverse team that covers a lot of key support services: internal IT, systems administration, software development, quality assurance, and online sales (ecommerce and marketplaces). We’re an IT business, so my remit is digital transformation and leveraging technology appropriately, both for us and for our customers. A key part of that is ensuring circularity is involved in all our decision making, which is why our Sustainability Lead, Astrid, works with me. It’s through that piece of the puzzle that the Knowledge Transfer Partnership (KTP) between Techbuyer and the University of East London got started. Since it was research and development focused around technology, it landed on my desk. The work on the project – creation, scoping, building equipment, benchmarking, quality assurance, and finally the development of the tool – led to Interact being created.
We see three key uses for the software:

1 – State-of-play reporting for IT sustainability. At a very basic level, you can use the tool to audit your existing data centre IT infrastructure. The tool will provide you with a comprehensive report on your current energy use and scope 2 and scope 3 emissions. This is useful information to help inform your ongoing strategy on reducing emissions and energy waste, as it will provide smart recommendations to optimise based on your current situation. This will be invaluable for audit firms, energy consultants and data centre managers.

2 – Informing your hardware refresh strategy. You now know the starting point and the recommended actions. You could carry out rolling refreshes on your 10% least energy-efficient servers, assess and get smart recommendations on replacements, and then implement – before running the reporting software again to see what benefits this has delivered. This could inform your IT strategy for the next five years, allowing you to address your worst performers every year. This is vital for technical managers and those who look after data estates in private cloud or on-premises.

3 – Rack-level consolidation. Hybrid environments are becoming crucial for businesses. Large businesses often need to consolidate rack space, either because they have moved some servers and applications off-site or because they have been through mergers and now have multiple sites that can be taken offline or reduced. The tool allows a rack-level approach to consolidation by analysing the kit in and across specified racks. This gives a flexible approach where efficiencies can be tackled one rack at a time. In one recent example, we provided options to potentially consolidate 35 racks into 9. This might not be fully possible based on the number of nodes needed, but from a performance standpoint it would have been achievable.
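The rolling-refresh idea in point 2 – rank the estate by efficiency and flag the worst 10% each cycle – can be sketched in a few lines. This is an illustrative sketch, not the Interact tool itself; the server names, benchmark scores and wattages below are invented for the example:

```python
# Illustrative sketch (not Interact's actual logic): rank a server estate
# by energy efficiency (performance per watt) and flag the bottom 10%
# as candidates for a rolling hardware refresh.
from dataclasses import dataclass

@dataclass
class Server:
    name: str
    perf_score: float   # benchmark throughput (e.g. a SERT-style score)
    avg_watts: float    # measured average power draw

def refresh_candidates(estate, fraction=0.10):
    """Return the least efficient `fraction` of servers, worst first."""
    ranked = sorted(estate, key=lambda s: s.perf_score / s.avg_watts)
    n = max(1, round(len(estate) * fraction))
    return ranked[:n]

# Made-up estate: newer nodes score higher and draw less power.
estate = [Server(f"node-{i:02d}", perf_score=100 + 5 * i, avg_watts=250 - 3 * i)
          for i in range(20)]
for s in refresh_candidates(estate):
    print(s.name)  # the two oldest, least efficient nodes
```

Repeating this each year against fresh measurements gives the rolling 10%-per-cycle refresh described above.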
The tool uses a massive database of servers, components and configurations, built up over 15 years of working in the sector. It combines this with benchmark experiments, run over the last two years against industry-recognised SERT scores, to provide a base for mathematical models of energy consumption and for machine learning to be applied. This means we can analyse the energy consumption of server infrastructure with over 90% accuracy and make recommendations to reduce both energy consumption and CO2 emissions. The result is significant cost savings, running to millions of pounds even in small data centres. No other tool like this can be run independently of the data centre and make hardware recommendations covering scope 2 and scope 3 emissions whilst remaining completely vendor neutral. It creates incredible flexibility in how you optimise and manage your IT infrastructure over the long term.
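Interact’s actual models are proprietary, but a common starting point for this kind of energy estimate is the standard linear power model, which interpolates between a server’s measured idle and full-load draw by utilisation. A minimal sketch, with made-up wattages:

```python
# A standard first-order server power model (not Interact's actual model):
# estimated draw rises linearly from idle power to full-load power with
# CPU utilisation. Real tools layer benchmark data and ML on top of this.
def server_power_watts(idle_w, max_w, utilisation):
    """Estimate power draw at a given utilisation (0.0 to 1.0)."""
    if not 0.0 <= utilisation <= 1.0:
        raise ValueError("utilisation must be between 0 and 1")
    return idle_w + (max_w - idle_w) * utilisation

def annual_kwh(idle_w, max_w, avg_utilisation, hours=8760):
    """Annual energy for one server held at a constant average utilisation."""
    return server_power_watts(idle_w, max_w, avg_utilisation) * hours / 1000

# e.g. a server idling at 100 W, peaking at 300 W, averaging 30% load:
print(round(annual_kwh(100, 300, 0.3)))  # -> 1402 (kWh/year)
```

Multiplying the kWh figure by a grid carbon-intensity factor gives the scope 2 emissions estimate for that server.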
I think a large issue for data centres is developing a strategy for edge computing. Many data centres have no incentive to become more energy efficient because they charge by the watt. It’s a twisted mechanism, and one that is not going to be possible for edge deployments. Space will be at a premium, and performance will need to be carefully managed to make the most of the available space. With the physical infrastructure optimisations that work for hyperscale data centres no longer applicable, the sector will have to address efficiency on a much smaller scale – at hardware and rack level.
It’s a great piece of original research and I’m very proud to have contributed to it. Working on the benchmarking with Nour was fascinating and the results were remarkable. The paper showed that the way CPUs have been developing has led to incremental, but not necessarily dramatic, improvements in top-end performance, while trading off some of that top-end ability against higher power draw at idle and low utilisation. For servers, where the CPU accounts for around 65% of the power requirement, this means that some older CPUs and generations of servers can actually be more efficient than newer ones, depending on their workloads. The tests on configuration effects were equally important. They showed that correct configuration of RAM, CPU and storage in any generation is critical to making the best decisions on performance versus efficiency. This was combined with hundreds of hours of benchmarking of new versus refurbished components, where the performance difference between them was statistically insignificant. In short, it showed that refurbished and new components and servers perform identically. Google has known this for years at a component level, but to be able to prove it and have the research scientifically and academically verified is incredible.
I think Interact can be a significant force for good in combating the increasing impact the data centre industry has on the environment. The savings we have been able to help make on energy, CO2 and ultimately cost could be massive. Long term, I want every data centre to look at its hardware effectiveness and make the right decisions to reduce its impact. By the end of 2021, I’d like to think we will be launched and operational in all our key regions worldwide. I’d expect us to have contributed a second academic paper based on additional research and to be working with around 50 data centres. That’s the plan. Ambitious, but I think we have a shot here to do something special. Get in touch today to see how Rich and his team can help your business.