In mathematics, certain problems are considered impossible to solve in a reasonable amount of time. These problems are more common than you might think, and they can appear deceptively simple. For example: what is the shortest route through a particular set of 85,900 cities? Answering that question took 136 CPU-years. Difficult problems like this are called NP-hard.
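To see why the city problem is so hard, consider the naive approach: try every possible tour and keep the shortest. This is a minimal sketch of that brute-force traveling-salesman search; the function names and the toy four-city input are my own illustration, not anything from the 85,900-city computation. The number of tours grows as (n-1)!, which is why this only works for tiny inputs.

```python
import itertools

def tour_length(cities, order):
    """Total length of a closed tour visiting cities in the given order."""
    dist = lambda a, b: ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    return sum(dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def brute_force_tsp(cities):
    """Try every permutation: (n-1)! tours, so feasible only for tiny n."""
    n = len(cities)
    best = min(itertools.permutations(range(1, n)),
               key=lambda rest: tour_length(cities, (0,) + rest))
    return (0,) + best

# Four cities on a unit square: the optimal tour is the perimeter, length 4.
cities = [(0, 0), (0, 1), (1, 1), (1, 0)]
tour = brute_force_tsp(cities)
print(round(tour_length(cities, tour), 6))  # 4.0
```

At 4 cities that is 6 tours; at 85,900 cities the count dwarfs the number of atoms in the universe, which is why the real computation needed far cleverer techniques and still consumed 136 CPU-years.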
When researchers tackle a new problem, they need to determine whether it is tractable or NP-hard. They know a new problem is NP-hard if they can transform an existing NP-hard problem into it: if the new problem had an efficient solution, the old one would too.
So why did I bring this up? I think a similar transformation might be possible between the IT infrastructure and industrial automation domains, but not in the way you might expect.
Industrial automation is about . . . automation
Recently I sat down with the guys at Automation.com to discuss trends they were seeing. They pointed out that factories are starting to perform extremely fine-grained resource allocation to focus on the bottom line. One example is a facility that balances, every few seconds, the cost of power and materials against the production plan to dynamically utilize resources as efficiently as possible. The kicker? Factories are doing this now using off-the-shelf software.
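A loop like that can be surprisingly simple at its core. Here is a minimal sketch of the idea; the cost curves (quadratic power draw, linear material use) and all the prices are assumptions of mine for illustration, not anything specific to the facility described above.

```python
def choose_production_rate(power_price, material_price, unit_revenue, rates):
    """Return the production rate with the best margin under current prices.

    The cost curves are illustrative assumptions: power draw is modeled as
    quadratic in the rate, material consumption as linear.
    """
    def margin(rate):
        power_cost = power_price * rate ** 2
        material_cost = material_price * rate
        return unit_revenue * rate - power_cost - material_cost
    return max(rates, key=margin)

# Re-evaluated every few seconds as prices change; these prices are made up.
best = choose_production_rate(power_price=0.5, material_price=1.0,
                              unit_revenue=10.0, rates=range(11))
print(best)  # 9
```

The interesting part in a real facility isn't this arithmetic, of course; it's feeding the loop live price and sensor data fast enough that the answer is still valid when it's acted on.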
The IT transformation
When I heard about this incredible automation going on, it immediately sounded like an opportunity for the IT crowd. As systems continue to migrate towards service-based billing, costs will likely become more dynamic. Let’s take the cloud as an example.
During peak hours:
– your IaaS vendor may start charging a premium for disk writes
– your PaaS vendor may start charging a premium for database access
– your SaaS vendor may start charging a premium for throughput.
As an architect, you need to start thinking about all of these variable costs in your system design. And you thought the cloud was going to simplify things! So, how are you supposed to manage this incredible complexity? Are you going to develop this expertise in house?
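To make the idea concrete, here is a minimal sketch of one response to peak pricing: a writer that buffers non-urgent disk writes while the per-write price is above a threshold and flushes them once prices drop. The class, the pricing feed, and the threshold are all hypothetical illustrations of the pattern, not any vendor's actual API.

```python
class CostAwareWriter:
    """Defer non-urgent writes while a (hypothetical) per-write price
    is above a threshold; flush them once pricing returns to off-peak."""

    def __init__(self, backend, peak_threshold):
        self.backend = backend          # callable that performs one write
        self.peak_threshold = peak_threshold
        self.buffer = []

    def write(self, record, current_price):
        if current_price > self.peak_threshold:
            self.buffer.append(record)  # peak pricing: hold the write
        else:
            self.flush(current_price)   # off-peak: drain backlog first
            self.backend(record)

    def flush(self, current_price):
        if current_price <= self.peak_threshold:
            for record in self.buffer:
                self.backend(record)
            self.buffer.clear()

# Made-up prices: the first write is deferred, the second triggers a flush.
written = []
w = CostAwareWriter(written.append, peak_threshold=0.01)
w.write("a", current_price=0.05)
w.write("b", current_price=0.005)
print(written)  # ['a', 'b']
```

Even this toy version raises the real architectural questions: which writes can safely wait, how much buffering you can afford, and what happens if prices never drop.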
What would the pragmatic programmer do?
Reuse, reuse, reuse. There is an unexplored opportunity to leverage the process automation knowledge and methodologies that exist right now. It’s not just resource optimization: these guys also have deep experience with real-time systems. As we move to a dynamic world, reaction speed will become a critical part of any architecture, and much harder to control, since you inherit the performance of your weakest link.
I’m looking forward to further discussion with the IA community around the application of process control IP to the IT realm. My hunch is that there are some killer applications just around the corner for the cloud vendors.
Am I nuts? Time will tell.
Follow me on Twitter @joshneland