Microsoft seems to be paying a lot to buy LinkedIn if you just look at the “P&L” side of the deal.
But if you examine the long-term MSFT strategy of making Office 365 the pervasive business and social platform for documents, presentations, and spreadsheets, then the deal looks worth the investment cost.
In doing work for a local healthcare product venture, I was asked to look at the network and database requirements to support mixed-content transactions and video streaming, all while conforming to HIPAA compliance standards. As part of this work, I developed a Web Services, cloud-based architecture that took into account EHR, HL7, document management, and provider notation.
This tasking led me into a deep dive on the data architecture and database requirements analysis needed to develop the design.
The question of utilizing a standard RDBMS (SQL) vs. NoSQL was an immediate consideration. My conclusion: it depends on a large number of technical, business, and regulatory factors to derive the appropriate architectural answers. For example, what other external systems are interfaced to the applications, and how do they require interaction? In general, with the prolific growth in web services, mobile, and cloud computing, today’s enterprise will require a polyglot data architecture to satisfy all stakeholders.
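To make the polyglot idea concrete, here is a minimal sketch of a routing layer that sends rigidly structured records (e.g. HL7-coded results) to a relational-style store and free-form provider notes to a document store. All names are hypothetical, and in-memory dicts/lists stand in for the real back ends:

```python
# Minimal polyglot-persistence sketch (hypothetical names).
# Structured records with a fixed schema go to an RDBMS-style store;
# schemaless blobs (e.g. provider notes) go to a document store.

class PolyglotRouter:
    def __init__(self):
        self.relational = {}   # stand-in for a SQL database (keyed rows)
        self.documents = []    # stand-in for a NoSQL document store

    def save(self, record):
        # Route by shape: records carrying the fixed schema keys go
        # relational; everything else goes to the document store.
        if "patient_id" in record and "hl7_code" in record:
            key = (record["patient_id"], record["hl7_code"])
            self.relational[key] = record
            return "relational"
        self.documents.append(record)
        return "document"

router = PolyglotRouter()
print(router.save({"patient_id": 42, "hl7_code": "OBX", "value": 5.4}))  # relational
print(router.save({"note": "Patient reports improvement."}))            # document
```

In a real system the routing decision would also weigh the regulatory factors mentioned above (audit, retention, access control), not just record shape.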
A look at healthcare informatics provides an operational insight into some of the complexities.
“Healthonomics” can be the key driving factor triggering enterprise decisions to support multiple types of DB solutions, woven together heterogeneously, delivering a network of web services that affects healthcare outcomes.
As we start to see the uptake in 4K video content, suppliers of CPUs, NICs (network interface cards), networks (LAN, WLAN, Wi-Fi), and storage technologies will all be struggling to “step up to the plate” in meeting the challenges of this disruptive video format. IaaS platform providers will also face huge challenges configuring cloud components that can be rapidly provisioned for 4K content or video streaming. Even the security industry will be affected with regard to video surveillance infrastructure (see this Video Security Magazine article).
This is a Technologies Strategic Directions “Sleeping Inflection Point” for multiple industries, manufacturers, e-workers, and information consumers.
Ultra-high-definition (UHD) resolution is 3840×2160 pixels, now used in displays and broadcast. This does not equal 4K (4096×2160 pixels), used in digital cinema. People tend to use the terms interchangeably, but there is a significant difference in impact on the networking bandwidth required to service consumption of 4K.
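A quick back-of-the-envelope calculation shows the scale of the bandwidth problem. Assuming uncompressed 8-bit RGB (24 bits per pixel) at 30 frames per second, both formats demand multiple gigabits per second of raw throughput:

```python
# Raw (uncompressed) video bitrate for UHD vs. DCI 4K.
# Assumptions: 24 bits per pixel (8-bit RGB), 30 frames per second.

BITS_PER_PIXEL = 24
FPS = 30

def raw_gbps(width, height, bpp=BITS_PER_PIXEL, fps=FPS):
    """Raw video bitrate in gigabits per second."""
    return width * height * bpp * fps / 1e9

uhd = raw_gbps(3840, 2160)   # consumer UHD
dci = raw_gbps(4096, 2160)   # digital-cinema 4K
print(f"UHD: {uhd:.2f} Gbps, DCI 4K: {dci:.2f} Gbps "
      f"({(dci / uhd - 1) * 100:.1f}% more)")
# → UHD: 5.97 Gbps, DCI 4K: 6.37 Gbps (6.7% more)
```

Real delivery uses heavy compression (H.264/HEVC), but the raw figures explain why NICs, switches, and storage all feel the squeeze from this format.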
We are all aware, from a display-technology perspective, that TVs now offer this content. But what about other network and computer infrastructure components? When will they be able to handle the disruptive impact of 4K?
This week IBM, Samsung, New York State, and Global Foundries announced a new high-capacity chip made with a combination of silicon and germanium.
Are IBM et al. leading us in the right direction? As the width of connections on chips approaches the diameter of the individual silicon atoms, EUV etch stations and changes in deposition technology are just the tip of the CAPEX impact required to transition and follow the consortium’s lead. At approximately $2B per fab in the near future, who can afford to follow? What ripples will this cause in the ecosystem of silicon equipment manufacturing, and at today’s commodity pricing, can the ASPs tolerate this new move? Even though Intel mentions 7 nm occasionally, there seems to be no defined roadmap to get there. Consortiums and research are good things. However, we now have to figure out practical steps to get to the future the consortium has described.
A recent interview with the Red Hat CEO touts the benefits of private cloud implementation. See it HERE.
This debate is usually short-sighted and doesn’t include all the CAPEX and OPEX costs associated with the “free OS” type of cloud operations. Also, the reusable components from more sophisticated partner communities afford both AWS and Azure much greater long-term valuations when responsible enterprise accounting methods are used to drive the cost-benefit analyses. The proper engineering of a cloud infrastructure, including smart VMs well orchestrated by business-demand-driven auto-scaling, will always push the TCO/ROI argument toward a public solution for large-scale systems.
Microsoft actually has a TCO tool that can be used to estimate the TCO of on-premises vs. Azure. There are many considerations when comparing the costs of running an on-premises datacenter with full infrastructure (servers, cooling, power, etc.) to a cloud-based service like Azure, where you pay based on the services consumed, such as storage, compute, and network egress. It can be difficult to know exactly what typical costs are for your datacenter and what the costs would be for services running in Azure. Microsoft has a pricing calculator available at http://azure.microsoft.com/en-us/pricing/calculator/ which can help assess costs for Azure services, and a VM-specific calculator at http://azure.microsoft.com/en-us/pricing/calculator/virtual-machines/.
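The shape of the comparison is simple even if the real inputs are hard to pin down: on-premises cost is capital amortized over the server’s lifetime plus ongoing power/cooling, while cloud cost is rate × hours consumed. A toy model (all numbers illustrative, not real Azure pricing) makes the structure visible:

```python
# Toy TCO comparison (illustrative numbers, not real Azure pricing).
# On-premises: amortized capital cost plus monthly power/cooling,
# paid whether or not the workload runs. Cloud: pay per hour running.

def on_prem_monthly(capex, lifetime_months, power_per_month):
    """Effective monthly cost of an owned server."""
    return capex / lifetime_months + power_per_month

def cloud_monthly(rate_per_hour, hours_running):
    """Consumption-based cost for the hours actually used."""
    return rate_per_hour * hours_running

# Hypothetical workload needed only during business hours (~160 h/month)
print(f"on-prem : ${on_prem_monthly(3600, 36, 40):.2f}/mo")   # $140.00/mo
print(f"cloud   : ${cloud_monthly(0.20, 160):.2f}/mo")        # $32.00/mo
```

The crossover flips the other way for workloads that must run 24/7, which is exactly why the consumption habits described below matter so much.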
When running on-premises, you own the servers. They are available all the time, which means you typically leave workloads running constantly even though they may only be needed during the work week. There is really no additional cost to leave them running (apart from power, cooling, etc.). In the cloud you pay based on consumption, which puts organizations through a paradigm shift: rather than leaving VMs and services running all the time, companies focus on running services only when needed, to optimize their public cloud spend. Some approaches that can help optimize running services are:
Auto-scale – The ability to group multiple instances of a VM/service so that instances are started and stopped based on usage metrics such as CPU and queue depth. With PaaS, instances can even be created/destroyed as required.
Azure Automation – The ability to run PowerShell Workflows in Azure; templates are provided to start and stop services at certain times of day, making it easy to stop services at the end of the day and start them again at the start of the day.
Local Automation – Use an on-premises solution such as PowerShell or System Center Orchestrator to connect to Azure via REST to stop/start services.
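The core decision behind all three options above is the same: given the current time, should this workload be running? A scheduler (Azure Automation runbook, local PowerShell job, etc.) would evaluate something like the following before issuing start/stop calls; this is a sketch with a placeholder business-hours policy, not a real automation script:

```python
# Sketch of the "stop services out of hours" policy from the list above.
# A real scheduler would call should_run() and then invoke the cloud
# provider's start/stop API; the hours here are a placeholder policy.

from datetime import datetime

WORK_START, WORK_END = 8, 18     # run between 08:00 and 18:00 local time
WORK_DAYS = range(0, 5)          # Monday=0 .. Friday=4

def should_run(now: datetime) -> bool:
    """Return True if workday-only VMs should be running at this moment."""
    return now.weekday() in WORK_DAYS and WORK_START <= now.hour < WORK_END

print(should_run(datetime(2015, 3, 4, 10, 0)))   # Wednesday 10:00 → True
print(should_run(datetime(2015, 3, 7, 10, 0)))   # Saturday 10:00 → False
```

With ~160 business hours in a ~730-hour month, a policy this simple cuts the consumption-billed hours of a workday-only VM by nearly 80%.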
As a member of the Windows Insider Program, I have had a while now to install, investigate, and update the Windows 10 Technical Preview. I took a 12-year-old HP Windows 7 PC and upgraded it to Windows 10. All of my hardware and programs worked flawlessly. The new browser (code-named “Project Spartan”) is a big step forward in performance and functionality. Key features are built natively into the browser, and their major purpose is to make web-services content easier to read, share, and comment on.
Other, more important revelations have to do with Microsoft’s stated goals for their key technology – Windows. According to the new CEO Satya Nadella, “Windows 10 marks the beginning of the more personal computing era in the mobile-first, cloud-first world. Our ambition is for the 1.5 billion people who are using Windows today to fall in love with Windows 10 and for billions more to decide to make Windows home.” If you look at the new offerings superimposed on the backdrop of existing “must have” applications, Microsoft seems determined to make itself, through Windows, the big dog in the internet services arena. According to Terry Myerson, “We think of Windows as a Service – in fact, one could reasonably think of Windows in the next couple of years as one of the largest Internet services on the planet.”
He continued: “Windows 10 is the first step to an era of more personal computing. This vision framed our work on Windows 10, where we are moving Windows from its heritage of enabling a single device – the PC – to a world that is more mobile, natural and grounded in trust. We believe your experiences should be mobile – not just your devices. Technology should be out of the way and your apps, services and content should move with you across devices, seamlessly and easily. In our connected and transparent world, we know that people care deeply about privacy – and so do we. That’s why everything we do puts you in control – because you are our customer, not our product. We also believe that interacting with technology should be as natural as interacting with people – using voice, pen, gestures and even gaze for the right interaction, in the right way, at the right time. These concepts led our development and you saw them come to life today.”
My firm has been engaged by one of the world’s largest PRC-based small-appliance manufacturers to architect and implement a cloud/mobile/appliance IoT offering. This new small wine appliance will be launched in Q4 of 2014.
In fact, this is an exciting project in which WilQuest is partnering with Microsoft, InterKnowlogy, Tridea Partners, and others to create a “Cloud of Things” (CoT) infrastructure, where a global software/hardware engineering team is developing products on Azure, Windows 8, Android, iPhone, iPad, Intel, and ARM platforms to create a seamless web-services orchestration of devices and applications, each performing a segment of a task that the end user requests via gesture/mouse/keyboard action.
I’ve been interacting with other Intel Alumni regarding the Internet of Things and various prognostications about the work going on between Intel and Microsoft. Some believe there’s no room for Intel in this domain space, so I posted on an Intel Alumni social media site:
“Don’t count this partnership out quite yet…. The new management teams at both of these companies are well positioned to drive innovation and deliver cost-effective offerings in the IoT space. MSFT’s purchase of Nokia, along with the Surface device push, means that a lot of folks at MSFT are now focused on delivering HW/SW bundled solutions. If any collaboration can find a way to do it, the Intel army – along with a reinvigorated Bill Gates having a hands-on role at MSFT (in an architecture and creativity function) – could cause an inflection point for the IoT product domain.”
Also, look at this video from our friends in Redmond:
As heterogeneous computing starts to grow, intelligent networking will be the facilitator of smart enterprise systems architecture. Hardware vendors are beginning to put intelligent silicon on network adapters. Through deep packet inspection, this makes it realistic to provide Network Function Virtualization (NFV) and true Software Defined Networks (SDN) as part of the hardware/software computing infrastructure. It requires that intelligent NICs, SDNs, and web-services/cloud servers be engineered to “be aware” of the intelligence in the hardware, so that software can make smart choices based on business-logic context.
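To illustrate the idea in miniature: deep packet inspection classifies traffic by payload signature, and an SDN-style policy layer maps the classification to a forwarding action driven by business logic. The signatures and action names below are hypothetical, chosen purely for illustration (on real hardware this matching happens in NIC silicon, not Python):

```python
# Toy DPI-to-policy sketch: classify a packet by payload signature,
# then pick a forwarding action from business-logic context.
# Signatures and action names are hypothetical illustrations.

SIGNATURES = {
    b"\x17\x03": "tls",    # TLS application-data record prefix
    b"GET ":     "http",
    b"VID:":     "video",  # placeholder marker, not a real protocol header
}

def classify(payload: bytes) -> str:
    """Return a protocol label from the first matching payload signature."""
    for sig, proto in SIGNATURES.items():
        if payload.startswith(sig):
            return proto
    return "unknown"

def forward_action(proto: str) -> str:
    """Business-logic policy: prioritize video, offload TLS, default route."""
    return {"video": "priority-queue", "tls": "crypto-offload"}.get(proto, "default-route")

pkt = b"GET /index.html HTTP/1.1"
print(classify(pkt), forward_action(classify(pkt)))   # http default-route
```

The point of NIC-resident intelligence is that this classify-then-act loop runs at line rate in hardware, with the software layer supplying only the policy table.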