Microsoft Opens Up APIs, Protocols

Michael Desmond, founding editor of Redmond Developer News and Desmond File blogger, is on vacation. Filling in for him today is Kathleen Richards, senior editor of RDN. You can reach her at krichards@reddevnews.com.

Big interop news out of Redmond this morning. As reported by my RDN colleague Jeffrey Schwartz, Microsoft is making a push to be more open:

"In a major shift in its business model, Microsoft today said it is placing a significant emphasis on standardization and interoperability, saying it will share its APIs, release extensive documentation of its protocols, and is promising not to sue open source developers who use Microsoft's patented protocols for non-commercial implementations."

Read the rest of the story here. --Kathleen Richards

Posted on 02/21/2008 | 0 comments


Delphi Goes to School in Russia

Michael Desmond, founding editor of Redmond Developer News and Desmond File blogger, is on vacation. Filling in for him today is Kathleen Richards, senior editor of RDN. You can reach her at krichards@reddevnews.com.

Microsoft isn't the only company trying to get students on board early. In early February, Borland's developer tools subsidiary CodeGear announced a sizable licensing agreement in Russia. The company joins Corel and other, as-yet-unannounced vendors in a deal with the Russian Federal Agency of Education to provide technology and other resources for teaching programming in primary and secondary schools.

"It's a forward-looking investment by a government looking to build the next generation of technologists in the country," said Jim Douglas, chief executive officer of CodeGear, who has just returned from a trip to Moscow and St. Petersburg where his company has a development office. The Russian education program is targeted at students between the ages of 7 to 17 years old.

The licensing agreement, which covers up to 1 million seats, involves Borland's flagship rapid application development environments: the Windows-based Delphi, Delphi for .NET and C++Builder. "It's a combination of some older versions and newer versions. They are also using some Pascal products," Douglas said.

Forrester Research senior analyst Jeffrey Hammond views CodeGear's announcement in the same light as any large enterprise agreement. "It's a good first step," he said, "but to capitalize on it, Borland (or an integration partner) and the Russian education system need an aggressive rollout plan to make sure the copies actually get into the hands of the target users and are put into active use.

"It will take some time to do that, and only after that effort will we be able to really judge the impact of the deal," Hammond added.

Right now, CodeGear doesn't have similar agreements in place with any other educational systems. "Education is a significant part of our story around these technologies," Douglas said. "It is something that I am personally trying to re-inject into our company, into our culture and into our focus. But we are certainly not on the doorstep of doing anything this major anywhere else."

What do you think about putting development tools into the hands of primary, secondary and college kids around the world? Will early education better prepare the future workforce? Send your comments, rants or better ideas to krichards@reddevnews.com. --Kathleen Richards

Posted on 02/21/2008 | 1 comment


Two If by Sea

In the course of just over a week starting on Jan. 30, a total of five undersea data cables linking Europe, Africa and the Middle East were damaged or disrupted. The first two cables to be lost link Europe with Egypt and terminate near the Port of Alexandria.

Early speculation placed the blame on ship anchors that might have dragged across the sea floor during heavy weather. But the subsequent loss of cables in the Persian Gulf and the Mediterranean has produced a chilling numbers game. Someone, it seems, may be trying to sabotage the global network.

It's a conclusion that came up at a recent International Telecommunication Union (ITU) press conference. According to an Associated Press report, ITU head of development Sami al-Murshed isn't ready to "rule out that a deliberate act of sabotage caused the damage to the undersea cables over two weeks ago."

You think?

In just seven or eight days, five undersea cables were disrupted. Five. All of them serving or connecting to the Middle East. And thus far, only one cable cut -- linking Oman and the United Arab Emirates -- has been identified as accidental, caused by a dragging ship anchor.

So what does it mean for developers? A lot, actually. Because it means that the coming wave of service-enabled applications needs to take into account the fact that the cloud is, literally, under attack.

This isn't new. For as long as the Internet has been around, concerns about service availability and performance for cloud-reliant enterprise apps have centered on the ability of the global network to withstand a malicious attack or disruptive event. Twice -- once in 2002 and again in 2007 -- DDoS attacks have targeted the 13 DNS root servers, threatening to disrupt the Internet.

But assaults on the remote physical infrastructure of the global network are especially concerning. These cables lie hundreds or even thousands of feet beneath the surface. This wasn't a script kiddie kicking off an ill-advised DoS attack on a server. This was almost certainly a sophisticated, well-planned, well-financed and well-thought-out effort to cut off an entire section of the world from the global Internet.

Clearly, efforts need to be made to ensure that the intercontinental cable infrastructure of the Internet is hardened. Redundant, geographically dispersed links, with plenty of excess bandwidth, are a good start.

But development planners need to do their part, as well. Web-based applications shouldn't be built on the expectation of limitless bandwidth. Services and apps must be crafted so that they can fail gracefully, shift to lower-bandwidth media (such as satellite) and give priority to business-critical operations. In short, your critical cloud-reliant apps must continue to work when almost nothing else will.
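
What might "fail gracefully" look like in code? Here's a minimal sketch, in Python, of a client that routes around a dead primary link. The endpoint URLs, the path and the critical flag are invented for illustration; the fallback is assumed to be a slow but reachable satellite-backed route.

```python
# A minimal sketch of bandwidth-aware failover for a cloud-reliant app.
# PRIMARY and FALLBACK are hypothetical endpoints, not a real API.
import urllib.request
import urllib.error

PRIMARY = "https://api.example.com/v1"   # assumed fiber-backed, low-latency route
FALLBACK = "https://sat.example.com/v1"  # assumed satellite-backed, high-latency route

def fetch(path: str, critical: bool = False, timeout: float = 5.0) -> bytes:
    """Try the primary route; fail over only for business-critical calls,
    so non-essential traffic doesn't saturate the thin backup pipe."""
    try:
        with urllib.request.urlopen(PRIMARY + path, timeout=timeout) as resp:
            return resp.read()
    except (urllib.error.URLError, TimeoutError):
        if not critical:
            raise  # non-critical work fails fast and frees the thin pipe
        # Satellite adds latency, so give the fallback a generous timeout.
        with urllib.request.urlopen(FALLBACK + path, timeout=timeout * 6) as resp:
            return resp.read()

# orders = fetch("/orders/pending", critical=True)
```

The point isn't the specific transport; it's that the failure path and the traffic-priority decision are designed in from the start rather than bolted on after the first outage.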

And all this, I might add, as the industry prepares to welcome the second generation of rich Internet application tools and frameworks. Silverlight 2.0 will debut at MIX08 next month. Adobe is upping the ante with its latest offerings. Developers will enjoy a major step up in their ability to craft enriched, Web-entangled applications and environments.

But as you make your plans and write your code, remember this one thing: The people, organization or government that most likely sliced those four or five cables in the Mediterranean and Persian Gulf -- they can do it again.

Have the suspicious cuts to undersea cables generated any concern or evaluation within your organization? Let me know at mdesmond@reddevnews.com.

Posted by Michael Desmond on 02/19/2008 | 5 comments


Driving Toward D

A couple of weeks ago, blogger (and sometime RDN contributor) Mary Jo Foley at All About Microsoft wrote about a new programming language in the works from Redmond codenamed "D."

D is a "textual modeling language" that's integral to Microsoft's ambitious Oslo initiative, which RDN has previously covered. Oslo aims to enable Microsoft's Dynamic IT strategy by offering tools and resources to help enterprises better plan, model, develop and deploy applications. Oslo is extremely wide-ranging, with aspects of the program driving new versions of Visual Studio, BizTalk Server and the .NET Framework. It'll be 2009 before Oslo actually arrives.

Critical to Oslo is the repository, which will serve as the central store for the enterprise's digital assets. Access to and manipulation of the repository will be enabled, at least in part, by the declarative D programming language. The intent is for business managers and non-technical stakeholders to be able to use D to perform modeling activities.

Don Demsak, a Microsoft MVP and XML expert, thinks we may see a coming-out party for both D and Oslo at the upcoming Microsoft Professional Developers Conference (PDC) in October.

"The D Language is the reason why the PDC was cancelled last year," Demsak said. "All I know is that they [Microsoft] have been very, very quiet about the D language. I'm hoping to see more at the MVP summit, but I really don't hold out much hope for the language, if they have gone toward making it data-driven."

So far, Oslo and D have been flying somewhat under the radar, despite the enormously broad scope of their goals. But we can expect to hear a lot more about these topics as we get deeper into 2008.

Are you looking forward to Microsoft's model-driven approach to application development? E-mail me at mdesmond@reddevnews.com.

Posted by Michael Desmond on 02/14/2008 | 0 comments


Redmond's Open Source Binary-to-OOXML Conversion Project

Keep an eye on SourceForge tomorrow for the launch of a Microsoft-led open source software project, which the company hopes will provide powerful conversion tools for existing MS Office binary files. The new project should help Microsoft extend the perceived value of its XML file formats for shops that currently have a large investment in its binary formats.
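
How might a shop actually apply such tools? Purely as a sketch, and assuming a command-line converter exists (the doc2x name below is a placeholder; the project's actual interface hadn't been published at press time), a bulk migration driver could look like this:

```python
# Speculative sketch: drive a batch .doc-to-.docx migration with a
# command-line converter. "doc2x" is a placeholder name, not the
# actual tool shipping in Microsoft's SourceForge project.
import subprocess
from pathlib import Path

def convert_tree(root: str) -> None:
    """Convert every legacy .doc under root, skipping files whose
    converted twin is already newer than the source."""
    for doc in Path(root).rglob("*.doc"):
        target = doc.with_suffix(".docx")
        if target.exists() and target.stat().st_mtime >= doc.stat().st_mtime:
            continue  # already converted
        subprocess.run(["doc2x", str(doc), "-o", str(target)], check=True)

convert_tree("/srv/archive")  # hypothetical document share
```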

You can read more about Microsoft's thought process at Brian Jones' blog post, here. This Gray Knowlton blog entry also offers a primer.

Posted by Michael Desmond on 02/14/2008 | 0 comments


OOXML Spat Continues

A few weeks back, I wrote about a Burton Group study that took a rather positive view of Microsoft's Office Open XML (OOXML) file format specification, while also casting doubt on the open source OpenDocument Format (ODF). I also published a Q&A with Sun Microsystems Chief Open Source Officer Simon Phipps, offering a bit of a rebuttal to the Burton Group report.

Now, it seems that the Burton Group and the OpenDocument Format Alliance, the leading promoter of the ODF spec, are in a back-and-forth over the conclusions and assertions of the original report. You can find the blow-by-blow, broken into three lengthy blog postings, here, here and here.

While some of this gets a bit chippy, this is exactly the kind of deep dive that helps developers and IT professionals make sense of the posturing and positioning of the two camps.

Have you drawn any XML file format conclusions? E-mail me at mdesmond@reddevnews.com.

Posted by Michael Desmond on 02/12/2008 | 0 comments


Office Is Back on the Menu

When Microsoft launched its first Microsoft Office System Developers Conference yesterday, it reminded me of old times. In an era of Web services, AJAX-based rich Internet mashups and portable implementations of the .NET Framework, it's nice to know that Microsoft can still trundle out an old-fashioned, monolithic application platform without a hint of shame or irony.

As RDN contributing editor John Waters reports, Microsoft is touting Office as a platform for development, tying the ubiquitous productivity suite into everything from back-end ERP software to public-facing Web services. To that end, Redmond is promoting Office Business Applications (OBA) as a distinct class of Office-aligned applications for businesses.

During the conference keynote, Bill Gates demoed a FedEx Outlook add-in, called QuickShip, that lets users schedule FedEx deliveries to their contacts from within the e-mail and calendar client. Microsoft also released the Office Composition Toolkit, a reference application based on Office and SharePoint Server 2007 for creating enterprise mashups. Finally, the new Office Live Small Business service extends the value of deployed Office clients by tying them into Web-hosted services -- a classic play on Redmond's Software-plus-Services model.
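
To give a flavor of what building on Office as a platform involves, here's a loose sketch -- not the QuickShip add-in itself, which is a proper managed add-in -- of scripting Outlook's object model from Python via the pywin32 COM bindings. The subject line and date are invented:

```python
# A loose illustration of the Office integration point OBAs build on:
# driving Outlook's COM object model. Requires Outlook and pywin32.
import win32com.client  # pip install pywin32

OL_APPOINTMENT_ITEM = 1  # Outlook's olAppointmentItem constant

def schedule_pickup(subject: str, start: str, minutes: int = 30) -> None:
    outlook = win32com.client.Dispatch("Outlook.Application")
    appt = outlook.CreateItem(OL_APPOINTMENT_ITEM)
    appt.Subject = subject
    appt.Start = start        # e.g. "2008-02-13 10:00"
    appt.Duration = minutes   # duration in minutes
    appt.ReminderSet = True
    appt.Save()               # lands on the default calendar

schedule_pickup("FedEx pickup: contract drafts", "2008-02-13 10:00")
```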

The effort to uplevel Office is an understandable one. When you have OpenOffice and Google Apps commoditizing everything, you gotta run for the high ground. And in this case, as ever with Microsoft, that high ground is populated by developers. The question is: In a world where IT and users seek openness and flexibility, how compelling is an app dev platform that ties you into a truly massive client?

Are you developing OBAs for the Office platform? What do you like about Microsoft's strategy and what would you like changed? E-mail me at mdesmond@reddevnews.com.

Posted by Michael Desmond on 02/12/2008 | 0 comments


Oh, and SQL Server 2008 Is Delayed

I want to tip my cap to RDN columnist Peter Varhol, who pointed me to a Joel Spolsky blog post expressing profound exasperation at the way Microsoft reveals delays. Spolsky savages the doublespeak found in this TechNet blog post by Microsoft Director of SQL Server Project Management Francois Ajenstat, who reveals that SQL Server 2008 will be delayed by a full quarter.

Spolsky pulls out this snip from Ajenstat's post, though the rest of it is thick with the florid language of self-congratulation.

"We want to provide clarification on the roadmap for SQL Server 2008. Over the coming months, customers and partners can look forward to significant product milestones for SQL Server. Microsoft is excited to deliver a feature complete CTP during the Heroes Happen Here launch wave and a release candidate (RC) in Q2 calendar year 2008, with final release to manufacturing (RTM) of SQL Server 2008 expected in Q3. Our goal is to deliver the highest quality product possible and we simply want to use the time to meet the high bar that you, our customers, expect."

For those who fell asleep before the good parts, SQL Server 2008 is delayed until Q3.

Now, I'm not one to light into Microsoft for incremental product delays. Building enterprise-class server software is hard, serious work. Delays can, and often must, happen to ensure the quality of the final product. Furthermore, by all accounts SQL Server 2005 has significantly raised the bar for the Redmond database franchise, and the early buzz on the 2008 version is very positive. But this is no way to talk to developers.

Fortunately, the Internet (a series of tubes, if you will) is there to speak for us in the form of Simple-Talk columnist Phil Factor (no, not his real name). In this delicious parody, Mr. Factor shows how this Microsoft-speak might play out if a student used it to answer a teacher's request for late homework.

Can Microsoft do a better job communicating to developers? What would you like to see changed or improved? E-mail me at mdesmond@reddevnews.com.

Posted by Michael Desmond on 02/07/2008 | 0 comments


Developers' Take: Microsoft Buying Yahoo

When Microsoft made its $44.6 billion tender offer to purchase online giant Yahoo, it did more than make a lot of waves in the IT and financial arenas. It also shook the confidence of a lot of developers.

You need look no further than this Mini-Microsoft blog post, which concludes that while developers at Microsoft expect little to happen any time soon, whatever eventually does happen will probably be bad. This snip pretty much sums it up:

Most engineers, as expected of engineers, see all the problems and that it's going to be a staggering mess, let alone that there are things that Yahoo! does way better than us and that our stuff should be dropped. Strategic optimists and those looking for a promotion will rebrand it as a synergistic opportunity to align our technological assets into a virtuous, hyper-competitive cycle to benefit our users, partners, and shareholders.

Did I detect a whiff of sarcasm there? Regardless, there are reasons to be both excited and suspicious about this proposed acquisition. The impact it would have on the numerous competing and overlapping services offered by the two companies would be nothing short of tectonic. And that clash could force developers all over the globe to make, adjust or reconsider decisions.

To get a sense of the integration challenge ahead, and the impact it would have on developers aligned to the Yahoo service stack, take a look at the fantastic comparison posted on the I Started Something blog. As author Long Zheng notes about the many competing services:

Now imagine for each and every one of these you have to make a decision -- to keep it as is, integrate Yahoo's into Microsoft's, integrate Microsoft's into Yahoo's, or even come up with a new hybrid. Simple branding aside, I think the developers are going to have to work quite a few late nights to integrate what I believe are two monolithic systems together.

Ouch.

We want to hear your take. Would a Microsoft-Yahoo combination disrupt your coding plans? What would you do if such a merger were formalized? Run to Google? Adopt all Microsoft solutions with the assumption that its interfaces and APIs will gain primacy? Or would you stick with Yahoo where it's superior, assuming its implementations might survive the acquisition?

Speak up and let our readers know, at mdesmond@reddevnews.com. We may publish your insights in our coverage of the Microsoft-Yahoo buyout in an upcoming issue.

Posted by Michael Desmond on 02/07/2008 | 0 comments


Remembering Jim Gray

It was just over a year ago -- Jan. 28, 2007, to be exact -- that Microsoft research fellow and Turing Award winner Jim Gray went missing off the coast of California during what was supposed to be a solo day trip on his 40-foot sailboat Tenacious. Despite an extensive search of the waters off San Francisco Bay, Jim Gray and his boat were never found.

The loss was a devastating one for the development community. Gray was a leading light in the area of database development and transaction processing. He helped create many of the technologies that are today at the heart of modern database and transaction processing systems. In 1995, Gray joined Microsoft to found and manage the Microsoft Bay Area Research Center (BARC), where he worked on a variety of projects. Among them was the Microsoft TerraServer Web site, which provided high-resolution, satellite-based photos of the Earth years before Google Earth.

Now, a year after Gray went missing, the Association for Computing Machinery (the organization that presents the Turing Award), the IEEE Computer Society and the University of California, Berkeley have joined to announce a tribute to Gray, planned for May 31 at the UC Berkeley campus. Jim Gray attended UC Berkeley from 1961 to 1969 and earned the school's very first Ph.D. in computer science.

Fittingly enough, the tribute will also feature technical sessions for registered participants. You can find more information about the tribute here.

Mike Olson, Oracle's vice president of embedded technologies, is scheduled to speak at the event about the search effort for Gray. In a statement released today, he said: "It is important to note that this is a tribute, not a memorial. Many people in our industry, including me, are deeply indebted to Jim for his intellect, his vision and his unselfish willingness to be a teacher and a mentor."

Posted by Michael Desmond on 02/05/2008 | 1 comment


Information as a Service

I don't have to tell anyone about the growing problem of complexity that faces development and IT managers. Enterprises find themselves managing increasing numbers of applications, tied to a diverse array of internal and external data sources, services and people. The end result: The information often is out there, but the applications -- and the people who rely on them -- can't get at it.

Enter the concept of Information as a Service (IaaS). Like Software as a Service (SaaS) before it, IaaS aims to break chokepoints that have developed as the scope, scale and interconnectedness of enterprise systems have grown. Where SaaS enables the flexible delivery of trusted applications to endpoints over public and private networks, IaaS enables a more dynamic, flexible and robust means for information access across evolving and growing infrastructures.

Enterprise vendors are already hard at it. From Microsoft's Dynamic IT strategy to IBM's Information on Demand and Oracle's Fusion middleware, the drive to enable flexible, scalable and ubiquitous access to business information is well underway.

Forrester Research recently published a report that looks at the emerging IaaS market. It describes IaaS as employing data virtualization middleware to enable a flexible architecture that minimizes data management through robust automation, while providing real-time data sharing that eliminates the need to ship massive volumes of data to individual applications and servers. Distributed data caching and a virtualized infrastructure deliver low latency performance, and also serve to boost resiliency via redundant data feeds. Centralized data access controls, provided again by a common middleware layer, ensure both security and compliance.
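
Strip away the vendor language and the pattern is a virtualized data facade: callers ask for information by name, and a middleware layer handles the cache, fails over across redundant feeds and enforces a central access policy. Here's a toy sketch in Python; the feeds, TTL and role map are invented for illustration:

```python
# Toy sketch of the data-virtualization idea behind IaaS: one facade,
# redundant feeds, a shared cache and a centralized access check.
import time
from typing import Callable

class InfoService:
    def __init__(self, feeds: list[Callable[[str], dict]], ttl: float = 30.0):
        self.feeds = feeds   # redundant data sources, in preference order
        self.ttl = ttl       # cache lifetime in seconds
        self.cache: dict[str, tuple[float, dict]] = {}
        self.acl = {"analyst": {"sales"}, "admin": {"sales", "hr"}}  # invented policy

    def get(self, key: str, domain: str, role: str) -> dict:
        if domain not in self.acl.get(role, set()):
            raise PermissionError(f"role {role!r} may not read {domain!r}")
        hit = self.cache.get(key)
        if hit and time.time() - hit[0] < self.ttl:
            return hit[1]            # low-latency answer from the cache
        for feed in self.feeds:      # resiliency via redundant data feeds
            try:
                value = feed(key)
                self.cache[key] = (time.time(), value)
                return value
            except ConnectionError:
                continue
        raise RuntimeError(f"all feeds failed for {key!r}")
```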

So which vendors are ahead of the pack? According to Forrester, BEA Systems, IBM, Oracle and Red Hat are in the lead, with BEA and Oracle standing out in all IaaS use cases defined in the report. For its part, Microsoft lags because it can't match the sheer breadth of leading platforms, but the report calls Redmond a leader in the high-performance and enterprise search use cases.

Is IaaS something your development shop is starting to look at? E-mail me at mdesmond@reddevnews.com.

Posted by Michael Desmond on 01/31/2008 | 0 comments


5 Questions with Sun Microsystems' Simon Phipps

Simon Phipps is chief open source officer for Sun Microsystems Inc. As such, he stands at the center of a heated debate over standards-based XML file formats like the OpenDocument Format (ODF) and Microsoft's Office Open XML (OOXML). With OOXML approaching a crucial late-February review by the International Organization for Standardization (ISO), Phipps has been busy. Really busy.

We talked with Simon and got his thoughts on the OOXML tussle, as well as Sun's efforts in the open source community.

Redmond Developer News: No doubt you noticed the Burton Group report that casts the OOXML specification in a rather positive light for enterprise-scale organizations. Do you believe that the authors are missing the mark?
Phipps: Yes, I believe they are. As a number of people and organizations like the ODF Alliance have already pointed out, the Burton Group report ignored many facts regarding ODF and downplays common criticisms regarding OOXML. [The report] seems to me to hold ODF to a higher standard than OOXML and to turn a blind eye to the mechanisms being used in the attempt to have OOXML approved by ISO via ECMA.

First, the Burton Group report claims that ODF is being controlled by Sun. But it's not. It is controlled by a standards body called OASIS. Neither the OASIS rules nor the current ODF TC charter and membership allow Sun to control ODF. Certainly, Sun is a strong contributor to ODF and thus has some influence, but Sun is not in a position to control ODF in any way.

Second, the report presents ODF as being simplistic. Yes, ODF tries to be as simple (and thus easy to understand and easy to use) as possible while providing a very sophisticated and mature feature set sufficient to implement a full office suite. The creators of ODF do not see a benefit in making a format unnecessarily complex and incomprehensible. That's why ODF reuses established open standards and common concepts as much as possible instead of using proprietary technologies and reinventing the wheel again and again.

Does this simplicity mean a small and limited feature set? No, ODF 1.0 was already a very feature-rich standard, but the soon-to-be-released version 1.2 will turn ODF into a very mature standard that will cover close to all document-centric usage scenarios.

One argument I hear is that OOXML and ODF are simply serializations of their respective productivity suites (MS Office for OOXML, OpenOffice.org for ODF). Is that a fair characterization?
No, it is not. As another Sun employee, Erwin Tenhumberg, pointed out in his blog quoting a KOffice developer, OOXML's goal is compatibility with one particular application -- Microsoft Office. Therefore, OOXML is very closely related to and dependent on the Microsoft Office implementation. In contrast, ODF is based on the OpenOffice.org XML file format, not the OpenOffice.org implementation.

That's a huge difference because the OpenOffice.org XML file format was designed with application, vendor and platform independence in mind. That is one reason why the OpenOffice.org XML file format reused many W3C standards. In addition, the specification process of the OpenOffice.org XML format has been transparent and public since the foundation of the OpenOffice.org project in 2000. Very early on, other people and organizations were able to help shape the format which was later chosen as the basis for the ODF efforts at OASIS.

From our readers' standpoint, XML is XML. Why should they care if OOXML achieves ISO certification? Won't it just mean that they now have two distinct ISO-approved, open standards-based XML specifications to choose from?
Well, don't be deceived by the fact that "XML" is in the name of one of the specifications. ASCII is ASCII, but ASCII text written in English isn't the same choice as ASCII text written in Swahili! Choice between different implementations of a standard is good for users and consumers; choice between standards typically is a nightmare.

Two standards for the same domain means format conversion, and format conversion means potential data loss and extra costs. The more development resources vendors have to put into implementing conversion tools and file format filters, the fewer resources will be available for true innovation.

There have been plenty of allegations about Microsoft's efforts to sway the ISO vote, including the confirmed case of a Microsoft office in Sweden promising marketing help to companies that got on board. Has the ISO process been manipulated or compromised by OOXML proponents, and what has been your role in countering any such activities?
According to the press, many countries in addition to Sweden have seen similar outcomes, such as a very fast rise in membership numbers. It is hard to make firm statements about how membership and voting numbers were affected by any kind of manipulations, but the whole experience has shown clearly that the process needs to be transparent.

In many countries it was impossible for interested parties to follow the discussions on a national standards body level unless they became members. Therefore, some decisions came as a surprise to many.

I guess over the next few months many national bodies and maybe even ISO will have a closer look at their rules again, and potentially make some changes.

I'm hard-pressed to think of a major software company that is as engaged in open source software as Sun, with products like StarOffice, Solaris and Java all entering the open source sphere. How has Sun balanced the drive for community engagement with the need to guide the direction of the software?
The key is to be a participant in and contributor to each community. There are companies that harvest the work of others and make only proportionately small additions in the process, but Sun has rejected that approach in favor of high levels of engagement.

In addition to the projects you mention, Sun is also highly engaged in a number of Apache projects, in the GNOME community and in many others. When you directly participate and contribute, you are able to guide the evolution of the software as well, and in the most transparent and appropriate way.

Posted by Michael Desmond on 01/29/2008 | 0 comments

