Microsoft's controversial decision to position the ADO.NET Entity Framework
as its recommended data access technology has generated a lot of backlash among
developers who made early bets on LINQ to SQL, which the company had released
with Visual Studio 2008 and the .NET Framework 3.5. See my complete story here.
I received quite a few e-mails from developers partial to LINQ to SQL and,
suffice it to say, many feel left at the altar.
While some I spoke with are coming to terms with it, others are angry. "I
feel stabbed in the back by this but I'm not moving," said Howard
Richards, a principal with UK development firm Conficient, who says he invested
a year in LINQ to SQL and now feels blindsided. "What annoys me most
is Microsoft's cavalier attitude to its developers in this regard. It took
me six months to port my application from a 'homebrew' data layer
to LINQ to SQL."
Yesterday I spoke with Tim Mallalieu, the program manager for both LINQ to
SQL and the Entity Framework, who says Microsoft anticipated the backlash but
believes both data access interfaces will be better off for the move. For those
who don't think the Entity Framework is a suitable replacement, Mallalieu
said, stay tuned.
"There's some pretty nifty stuff we're doing in the beta 2 time frame that
we are not speaking about as yet, I think it will give you a better experience
and will reduce the anxiety that people have around the Entity Framework," Mallalieu
said. "In terms of capabilities, I think will make the overall integrated experience
of ASP.NET, Dynamic Data, MVC and these other things easier, we did a little
bit of triaging and feedback, there is some valid feedback around complexity
in the Entity Framework and we are doing things to address that. "
What follows is an edited transcript of our conversation.
How widely deployed is LINQ to SQL today?
All indications were that it was like any nascent technology: there was a lot
of interest from an exploratory perspective, but there weren't a lot of significant
pushes -- in terms of the entire .NET ecosystem -- going into production. There
were a bunch of people kicking the tires, and there were some pretty interesting
things going into production. We are still trying to get a better way, in general,
in the company to gauge technology adoption, but today I can't give you a definitive
number.
Were you surprised at the reaction?
We knew that this wasn't going to be a popular decision just because LINQ
to SQL is a really interesting technology. It's very nice. The problem
is, though, when you make a decision like this, you can either say we don't
want to [tick] off the community, which means that you get a bunch of people
betting on a technology that will not meet their expectations
of future innovation, release after release. Or you can actually take the
hit and get the tomatoes thrown at you early, in an effort to do right by the
customers. So what we were trying to do, maybe we could have done it better,
is to do right by the customer and set expectations early for where we were
going.
For those who say this was a political decision and not a technology decision,
is that an unfair characterization?
There were a number of political aspects to why we released two technologies
to begin with, but in the grand scheme, what we are trying to do with the .NET
Framework right now is to reduce the number of overlapping technologies
we keep dumping out, as opposed to increasing it. We convinced ourselves
internally that it was okay to release LINQ to SQL and the Entity Framework because
there was clear differentiation between the two, and the markets we were going
to go after were different.
The reality is, if you go look at what people are asking for, the two stacks
were only two releases away from feature-set convergence. So you look at that
and say: we could spin up two teams of equal size to go do this work, and within
two releases you are talking about two stacks that look almost exactly alike;
or you can say one of these technologies has already been identified as a
strategic piece of a bigger data platform vision. From a shared investment
perspective and a technology roadmap perspective, it seemed like the right
thing to do. The problem is, because there were some initial conflicts that
people have rumbled about from the history of the two technologies, it's hard
to see that there was actually an attempt to make pragmatic decisions that
were not colored by any political intentions.
We had two technologies covering the O/RM space. One was pretty nifty, very
fast and lightweight, but people were saying they wanted things like a provider
model, many-to-many relationships and more complex inheritance mapping. The
other technology had already done that stuff, and we think it is the foundation
for a broader data platform vision. Would you build those features into the
first technology, or would you say it sounds like people want all of those
scenarios, but with the added simplicity? From a roadmap perspective it just
did not make sense to duplicate efforts in two code bases.
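To make that feature gap concrete: LINQ to SQL has no direct many-to-many mapping, so developers have to surface the join table as its own entity and traverse it by hand. Here is a minimal, hedged sketch of that pattern; the table and member names are hypothetical.

    using System.Data.Linq;
    using System.Data.Linq.Mapping;
    using System.Linq;

    // The bridge table between Products and Categories, mapped explicitly
    // because LINQ to SQL cannot hide it behind a collection property.
    [Table(Name = "ProductCategory")]
    public class ProductCategory
    {
        [Column(IsPrimaryKey = true)] public int ProductId;
        [Column(IsPrimaryKey = true)] public int CategoryId;
    }

    public static class ManyToManyQueries
    {
        // Finding the categories for a product means joining through the
        // bridge entity yourself.
        public static IQueryable<int> CategoryIdsFor(DataContext db, int productId)
        {
            return db.GetTable<ProductCategory>()
                     .Where(pc => pc.ProductId == productId)
                     .Select(pc => pc.CategoryId);
        }
    }

The Entity Framework, by contrast, can map the relationship itself and expose it as a collection on each end, which is the sort of capability Mallalieu alludes to.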
What can you say to those who feel stabbed in the back or duped by this
change in strategy?
There are few things I can say that will actually make it better. But as soon
as we came to the decision, we felt the best thing to do was to come out early
and tell people, so they understood what the situation was, as opposed to playing
them along. I think it would have been much more painful to wait two years to
be talking about why we weren't investing at that level in the technology.
We expect that people will continue to develop with LINQ to SQL; it's a
great technology. We are going to provide guidance, with the patterns & practices
group at Microsoft, for how to design LINQ to SQL applications, so if you are
happy with it, you just stay with it. If at some point after using LINQ to SQL
you want to move to the Entity Framework, hopefully, if you follow the guidance
that we will give, it won't be as hard to move. You don't just go down a path
where you've fallen off a cliff. But beyond that, it's not the kind of message
where I can sit here and say something to you that would be a panacea for the community.
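Until that guidance materializes, one plausible reading of "design LINQ to SQL so it won't be as hard to move" is to keep queries behind a narrow repository interface so the LINQ provider underneath can be swapped later. This is a hedged sketch under that assumption, not the official patterns & practices output; Customer and MyDataContext are hypothetical types.

    using System.Collections.Generic;
    using System.Linq;

    public interface ICustomerRepository
    {
        IList<Customer> FindByCity(string city);
    }

    // LINQ to SQL implementation; an Entity Framework implementation of
    // ICustomerRepository could later be swapped in without touching callers.
    public class LinqToSqlCustomerRepository : ICustomerRepository
    {
        private readonly MyDataContext db;  // hypothetical DataContext subclass

        public LinqToSqlCustomerRepository(MyDataContext db) { this.db = db; }

        public IList<Customer> FindByCity(string city)
        {
            return db.Customers.Where(c => c.City == city).ToList();
        }
    }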
In hindsight do you regret releasing LINQ to SQL and not just waiting for
the Entity Framework to be ready?
I think LINQ to SQL is a very important technology. It's unfortunate how
this is ending up for customers, but given where we were from a product
perspective and a technology perspective, LINQ to SQL is really important,
and I think in its current existence, and with the kinds of work that we
expect to do with it moving forward, it's still going to have a good following.
It's just not going to be the be-all-and-end-all enterprise O/RM that has
every little knob and bell and whistle; quite frankly, if you were to add every
little knob and bell and whistle, you'd wake up and find all the elegance
and simplicity of LINQ to SQL gone.
Do you see there being new LINQ to SQL features?
We see there being new LINQ to SQL features; I don't know if there will be substantial
new LINQ to SQL features in .NET 4.0, but after .NET 4.0 we have every intention
of doing feature work in LINQ to SQL. We are also doing a bunch of bug fixing,
servicing, and that kind of work. LINQ to SQL was developed by the C# team;
when Visual Studio 2008 and the .NET Framework 3.5 shipped, there was a transition of
the technology into our team. The problem we had was that the transition didn't
come with people; it came with just the technology, and we immediately were
trying to do work for .NET Framework 3.5 SP1. We wanted to add support for the
new SQL Server date types in LINQ to SQL, so we focused tactically on SP1, just
on getting in the features and design change requests that the C# team
said needed to be there to get the service pack done. When we shipped
that, we had to officially take ownership of the technology, which meant we had
to get it onboarded. We are different teams with slightly different
focuses, and we had to get new people ramped up on the technology. Given that
.NET Framework 3.5 SP1 released halfway through our development cycle for .NET
Framework 4.0, and given the onboarding work I just described, it was really hard
for us to do any significant work in .NET 4.0, but we intend to do feature work
in the future.
Posted by Jeffrey Schwartz on 12/18/2008
There is no shortage of opinion over Microsoft's efforts to point database developers away from its year-old LINQ to SQL data access method to its more recently released ADO.NET Entity Framework.
Microsoft's push, which I pointed out last week, is certainly not a revelation to those who follow these things. But what should someone who hasn't followed the machinations of this issue make of it? Or, even more pointedly, what about someone who is just moving to SQL Server and the .NET Framework?
Telerik CTO Stephen Forte recommends that such developers learn raw SQL, so that if they later use an object-relational mapping tool, whether LINQ to SQL or the Entity Framework, "they will know what is going on behind the scenes and use the raw SQL for the reporting solution as well as any complex queries and consider an ORM/LINQ/EF for the CRUD and simple stuff."
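Seeing "what is going on behind the scenes" is straightforward in LINQ to SQL: assigning a TextWriter to DataContext.Log echoes the T-SQL the runtime generates. A minimal sketch, assuming a Northwind-style context with a Customers table:

    using System;
    using System.Linq;

    class LogDemo
    {
        static void Main()
        {
            using (var db = new NorthwindDataContext())  // hypothetical context
            {
                db.Log = Console.Out;  // generated SQL is written to the console

                var londonCustomers = from c in db.Customers
                                      where c.City == "London"
                                      select c;

                foreach (var c in londonCustomers)  // query executes on enumeration
                    Console.WriteLine(c.CompanyName);
            }
        }
    }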
While Forte is concerned Microsoft's guidance on how to reconcile its various data access protocols won't be adequate for some time, he believes the shakeout will be organic. "Unfortunately the thing that makes Microsoft great and innovative, its sometimes disparate teams, leads to the confusion in the marketplace," Forte says.
In a blog posting of his own earlier this week, Forte pointed to a survey released by DataDirect Technologies last month that finds that 8.5 percent of .NET apps in production use LINQ to SQL as their primary data access method. "While this number is not huge, you can't ignore these developers voting with their feet by using LINQ to SQL in their applications," Forte says.
What's a LINQ to SQL developer to do? "Throw it all away and learn EF? Use NHibernate? No. The LINQ to SQL developer should continue to use LINQ to SQL for the time being. If the next version of the EF is compelling enough for a LINQ to SQL developer to move to EF, their investment in LINQ to SQL is transferable to LINQ to Entities. If LINQ to SQL developers are to move in the future, Microsoft will have to provide a migration path, guidance and tools/wizards. (The EF team has started this process with some blog posts, but the effort has to be larger and more coordinated.)"
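Forte's transferability point is visible in code: LINQ queries are written against IQueryable<T>, so the same query shape compiles whether the source is a LINQ to SQL Table<Order> or an Entity Framework query. A hedged sketch with a hypothetical Order entity:

    using System;
    using System.Linq;

    public static class OrderQueries
    {
        // Works unchanged over LINQ to SQL (dataContext.Orders) or
        // LINQ to Entities (objectContext.Orders).
        public static IQueryable<Order> RecentOrders(IQueryable<Order> orders, DateTime since)
        {
            return from o in orders
                   where o.OrderDate >= since
                   orderby o.OrderDate descending
                   select o;
        }
    }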
Microsoft will make sure LINQ to SQL continues to work in the .NET Framework 4.0 and will fix existing issues, wrote Damien Guard, a software development engineer in Microsoft's data programmability group who works on both LINQ to SQL and the Entity Framework, in a blog posting during PDC in October.
"We will evolve LINQ to Entities to encompass the features and ease of use that people have come to expect from LINQ to SQL," Guard wrote. "In .NET 4.0 this already includes additional LINQ operators and better persistence-ignorance."
That's not to say new features won't show up in LINQ to SQL, he added. "The communities around LINQ to SQL are a continuous source of ideas and we need to consider how they fit the minimalistic lightweight approach LINQ to SQL is already valued for."
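To make the "persistence-ignorance" Guard mentions concrete: the goal is entities that are plain CLR classes, with no mapping attributes and no framework base class, so the same type can be persisted by different stacks. A hedged illustration of the idea, not the exact shape of the .NET 4.0 feature:

    // No [Table]/[Column] attributes and no EntityObject base class;
    // the mapping lives outside the type.
    public class Customer
    {
        public int CustomerId { get; set; }
        public string CompanyName { get; set; }
        public string City { get; set; }
    }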
Forte says LINQ to SQL developers will be ready to move to the Entity Framework when its feature set is a superset of the former and Microsoft offers migration wizards and tools for LINQ to SQL developers. If Microsoft is serious about the Entity Framework being the preferred data access solution in .NET 4.0, he argues, it will have to do a few things: "Make EF 2.0 rock solid. Duh. Explain to us why the EF is needed. What is the problem that the EF is solving? Why is EF a better solution to this problem? This is my big criticism of the EF team, the feedback I gave them at the EF Council meeting, is that they are under the assumption that 'build it and they will come' and have not provided the compelling story as to why one should use EF. Make that case to us!"
Also, Forte is calling on Microsoft to engage with the LINQ to SQL, NHibernate and stored procedures crowds.
Still, there are many who are not happy with Microsoft's decision to give short shrift to LINQ to SQL, notably OakLeaf Systems' Roger Jennings, who last week said that LINQ to SQL will not go away, simply because it is part of the current .NET Framework. Forte takes issue with that thinking.
"Just because something is in the framework is no guarantee that it will have a bright future," Forte said in his e-mail to me.
Jennings points to others who are weighing in on this issue as well, such as Stu Smith, a developer at UK-based BinaryComponents Ltd.:
There's no one correct way to write an ORM. Different applications have different requirements. A general purpose ORM will never satisfy 100 percent of developers. Fine. I'm happy with that; there's a nice market for specialist providers.
What I'm not happy with is that while LINQ to SQL seemed to make 90 percent of developers happy, it's being replaced with LINQ to Entities, which (judging by the feedback I've seen) makes far fewer developers happy.
I'm fine with the ADO.NET team writing a solution that fills that 10 percent gap or otherwise augments LINQ to SQL. I'm not happy with them replacing a 90 percent solution with a specialist 10 percent solution.
In the end, how this will all turn out remains to be seen, Forte points out. "We are still at the station buying tickets (to an unknown destination)."
What's your opinion? Drop me a line at jschwartz@1105media.com.
Posted by Jeffrey Schwartz on 12/10/2008
Developers are reckoning with the fact that Microsoft's LINQ to SQL data access technology is getting short shrift in Redmond these days as the company sharpens its focus on the second version of the ADO.NET Entity Framework.
Some would argue LINQ to SQL was DOA when it arrived in the .NET Framework 3.5 just over a year ago, but Microsoft's recent messaging leaves little doubt that the company has no major plans to further enhance LINQ to SQL. For many, a blog post during PDC by Tim Mallalieu, the program manager for both LINQ to SQL and the Entity Framework, sealed its fate.
"We're making significant investments in
the Entity Framework such that as of .NET 4.0
the Entity Framework will be our recommended data
access solution for LINQ to relational scenarios,"
he wrote on Oct. 29. Two days later, as people were returning home from PDC, he added: "We will continue to make some investments in LINQ to SQL based on customer feedback."
Many are saying that is code for "LINQ to SQL is finished." "It is dead as a doorknob," said Stephen Forte, chief strategy officer at Telerik Inc. and a Microsoft regional director, speaking at a .NET User Group meeting in New York two weeks ago.
To put Forte's remarks in context, he was giving
a talk on the various data access alternatives,
including the Entity Framework, ADO.NET Data Services
with REST, and ASP.NET Dynamic Data, among others.
"In my opinion there is going to be a shakeout;
the first casualty will be LINQ to SQL,"
Forte told the group.
For his part, Mallalieu explains in his Oct. 31 post that Microsoft had been looking at how to evolve both LINQ to SQL and LINQ to Entities. "At first glance one may assert that they are differentiated technologies and can be evolved separately," Mallalieu wrote at the time. "The problem is that the intersection of capabilities is already quite large and the asks from users of each technology take the products on a rapid feature convergence path."
Andrew Brust, director of new technology at twentysix
New York, said given both are relatively new,
Microsoft's moves shouldn't prove disruptive to
most developers. "Both are new and neither
has gathered so much steam that the victorious
emergence of the other could be viewed as a huge
imposition," Brust writes in an e-mail. "To
me it's like Blu-ray winning out over HD DVD.
While people who bought HD DVD players and discs
are not happy about the outcome, they represent
a small group of early adopters, all of whom were
warned of and understood the risks in making an
early commitment."
Roger Jennings, principal of Oakland, Calif.-based
OakLeaf Systems, authored this month's Visual
Studio Magazine cover
story covering object/relational mapping using
LINQ to SQL. Jennings explains that while Microsoft may abandon any significant enhancements of LINQ to SQL, it is forever part of the .NET Framework 3.5, and despite Microsoft's messaging on the next version of the Entity Framework, many developers may still be inclined to work with LINQ to SQL.
"LINQ to SQL is alive and well," Jennings
says. "They can't remove it because it's
part of the .NET 3.5 Framework."
Jennings believes many developers will continue
to use LINQ to SQL, given the direction Microsoft
is taking Entity Framework v2. He, for one, laments
Microsoft's
announcement last month that v2 won't support
N-Tier architectures.
Jennings says Microsoft appears to be backing off on other features that were presumed to be slated for EF version 2. In a blog posting Tuesday, Microsoft explained how developers should migrate stored procedures developed with LINQ to SQL to EF using Visual Studio 2010.
But, says Jennings, Microsoft made it less certain than in earlier messaging that stored procedure support will make the EF v2 cut. "What they are saying now is support for stored procedures might be implemented in EF v2, instead of will be," Jennings says. "Basically what they are doing is backpedaling on their original commitment."
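For context, this is roughly the kind of LINQ to SQL stored procedure mapping whose EF migration path is in question. The [Function]/ExecuteMethodCall pattern is what the LINQ to SQL designer generates; the procedure and entity names here are hypothetical.

    using System.Data.Linq;
    using System.Data.Linq.Mapping;
    using System.Reflection;

    [Table(Name = "Customers")]
    public class Customer
    {
        [Column(IsPrimaryKey = true)] public int CustomerId;
        [Column] public string CompanyName;
        [Column] public string City;
    }

    public class NorthwindDataContext : DataContext
    {
        public NorthwindDataContext(string connection) : base(connection) { }

        // Binds this method to the dbo.GetCustomersByCity stored procedure.
        [Function(Name = "dbo.GetCustomersByCity")]
        public ISingleResult<Customer> GetCustomersByCity(
            [Parameter(Name = "city")] string city)
        {
            IExecuteResult result = this.ExecuteMethodCall(
                this, (MethodInfo)MethodInfo.GetCurrentMethod(), city);
            return (ISingleResult<Customer>)result.ReturnValue;
        }
    }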
Jennings also pointed to the LINQ to SQL designer, which allows developers to map stored procedures that return scalars. While acknowledging that such automatic code generation of methods is missing from the Entity Framework, Microsoft is now saying "this is something that is being strongly considered for the next release of Entity Framework." Jennings said it was presumed that would make the EF v2 release.
"That's called it's fallen off the list,"
Jennings says. "The upshot is it appears
the team is paring their list of what they are
going to implement in EFv2 form what the original
plan was."
As a result, Jennings believes many developers might still opt to use LINQ to SQL via a Visual Studio add-in developed by Huagati Systems, based in Bangkok, Thailand. Huagati's DBML/EDMX tool adds menu options for syncing LINQ to SQL designer diagrams with changes in the database.
"It's ironic that a lone developer can provide
add-ins for features that the ADO.NET Entity Framework
v2 team aren't even proposing for their upgrade,"
Jennings says.
What's your take on this? Drop me a line at jschwartz@1105media.com.
Posted by Jeffrey Schwartz on 12/04/2008
Database administrators and developers converged on Seattle for this week's annual Professional Association for SQL Server (PASS) conference, where Microsoft is talking up its recently released SQL Server 2008 and the forthcoming upgrade, code-named "Kilimanjaro." You can read all about that here.
One of the key advances that will enable Kilimanjaro is "Madison," the code name for the technology that will allow SQL Server to handle massive parallel processing. Microsoft's acquisition of DATAllegro back in September is providing the key assets in developing Madison.
It turns out that much of that work is happening in Madison, Wis., where Microsoft back in March announced its database research lab, called the Jim Gray Systems Lab, located at a facility not far from the University of Wisconsin-Madison. To run the lab, Microsoft brought on board as a technical fellow David DeWitt, who spent 32 years in academic research at the university. DeWitt will make his first public appearance as a Microsoft employee, in front of his largest audience ever, in a keynote address at PASS on Friday.
I had the opportunity, joined by my colleague Kathleen Richards, to talk with DeWitt this week. Here's an excerpt:
Was this lab built from the ground up?
I am still building it. It's a lot of work. I currently just have three staff members; we'll be funding up to six graduate students next semester. I have some open staff positions but I am very fussy about who I hire. I'm never going to have 50; the goal is to have 10 to 15 full-time staff, mostly Ph.D.s and some master's students, but people who like to build systems. I am a real hands-on systems builder.
What are you working on?
We are working with the DATAllegro team to look at parallel query optimization techniques. Optimizing queries is hard, optimizing for a scalable database system is even harder, and query optimization is something I've been interested in for a long time. We have one project that involves looking at some optimization techniques that will come out in a future release of the DATAllegro product.
What role did you have in proposing, suggesting the DATAllegro acquisition?
Zero. I had absolutely no role in the acquisition process. I knew about it soon after I joined, but Microsoft is very careful about how it does acquisitions these days. I was not involved in any way in the technical decision on whether to buy it or not. But I think it's a great acquisition. They've got a great product and I think Microsoft's expertise will be able to take it to an entirely new level. It's a great team. We were there last week. We are excited about working with them. It was like a big Christmas present as far as I am concerned because now, all of a sudden, I am working at a company that has a really seriously scalable parallel database system. Having built three in my life, getting a chance to work on a fourth [is] just like Christmas.
How do you see taking it to the next level?
First of all, replacing Ingres with SQL Server will certainly drastically improve the kinds of performance we should be able to get. SQL Server is a modern database system and Ingres is an old system. The DATAllegro system avoided using indices because the indexing in Ingres was not very effective. I think we'll get all of the benefits of SQL Server as the underlying engine. We're going to get this huge boost. DATAllegro is a startup; they have a great system, but it's a startup, and there are a lot of things that were done in the area of query optimization that I think we can improve on. Having built a number of parallel database systems in the past, I think we can offer something when it comes to optimization of queries that will allow us to scale even higher.
How else do you see SQL Server advancing as a platform?
SQL Server will advance as a platform by using DATAllegro as the base. Will DATAllegro make SQL Server more scalable? Absolutely. I think query optimization is the main unsolved problem in data warehousing today. I think we know how to build parallel database systems that scale to hundreds of thousands of nodes. DATAllegro already has one customer that's 400 terabytes. eBay has a competitor's system that has 5 petabytes. But there are really serious challenges in optimizing queries for hundreds of nodes and thousands of spindles. I think those are the opportunities that a team like mine can get its teeth into and make some forward progress. Query optimization is a problem that will be with us for a very long time, and we have some ideas for some new models for optimizing and executing queries that we will be exploring as part of the DATAllegro project.
You mentioned it can take 10 years for research to make it into a commercial product. Is that timeframe changing?
That's one of the goals of the lab. One of our ideas in setting up this lab was to have a much shorter path from the innovation by the graduate students and by my staff into the product line. That's one of the reasons I am not part of Microsoft Research, even though I'm an academic researcher. I am part of the SQL Server organization; we intentionally put this lab in the SQL Server organization so that we had a direct path from the university into the SQL Server organization. It would not have made much sense to try to do this lab as part of Microsoft Research, because then we wouldn't have a direct path.
What will you be talking about in your keynote later this week?
The other keynotes, they get to introduce new products and do fancy demos. I am just the academic guy. The talk is really going to be about the key components of a parallel or scalable database system: how partitioning works, the relationship between partitioning and indices, and what happens to a SQL query when it gets compiled on scalable parallel database systems. It will really be a lecture on the fundamental technologies behind today's scalable database products.
If you had to sum up your key message, what is your vision for where you'd like to see your efforts at Microsoft take the SQL Server platform moving forward?
I'd like to have us become the world leader in data warehousing. I think that we have a great SMP product; it's easy to use and it's got great performance. We can take on Teradata. I don't see any reason why we should not become the premier solution for very large-scale data warehousing.
Posted by Jeffrey Schwartz on 11/19/2008
Among many pressing questions that came up at last month's Professional Developers
Conference (PDC) was whether Microsoft's new Dublin app server extensions will
replace BizTalk Server. Microsoft says that's not the plan but it is important
to understand what Dublin is.
Microsoft released the first CTP of its new distributed application server
extensions to Windows Server, code-named Dublin, at PDC. Microsoft first disclosed
its plans to build these extensions in concert with the introduction of its
new modeling platform, code-named Oslo, last
month.
According to Microsoft, Dublin will incorporate key components of the new .NET
Framework 4.0 -- specifically the second iterations of Windows Communication
Foundation (WCF) and Windows Workflow Foundation (WF). In addition to improving
scalability and manageability, Microsoft said it will allow Windows IIS to function
as a host for apps that use workflow or communications.
I attended a session at PDC that outlined Dublin, where Product Unit Manager
Dan Eshner explained where Dublin fits. In short, if the modeling tool called
Quadrant in Oslo lets developers create models or domain-specific languages
(DSLs), think of Dublin as one deployment repository for those models. Dublin
is scheduled to ship roughly three months after Visual Studio 2010, Eshner said,
and will initially extend Windows Server, though it will ultimately be built
into future versions of the platform.
"Dublin really is a hosting environment for WF and WCF services,"
Eshner said. The goal, he added, was to take the heavy lifting and skill requirements
out of invoking WCF and WF services. "You can make these services work
without Dublin; you've just got to do some stuff. You've got to get all the configs
set up and you've got to do some work to create services out of them," he said.
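The manual "stuff" Eshner alludes to looks, in minimal form, like the following: self-hosting a WCF service today means defining the contract, opening a ServiceHost and wiring up an endpoint yourself, which is the work Dublin aims to absorb into the hosting environment. A hedged sketch; the address and contract are illustrative.

    using System;
    using System.ServiceModel;

    [ServiceContract]
    public interface IGreeter
    {
        [OperationContract]
        string Greet(string name);
    }

    public class Greeter : IGreeter
    {
        public string Greet(string name) { return "Hello, " + name; }
    }

    class Host
    {
        static void Main()
        {
            using (var host = new ServiceHost(typeof(Greeter),
                       new Uri("http://localhost:8000/greeter")))
            {
                host.AddServiceEndpoint(typeof(IGreeter),
                    new BasicHttpBinding(), string.Empty);
                host.Open();
                Console.WriteLine("Service running; press Enter to stop.");
                Console.ReadLine();
            }
        }
    }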
Within Visual Studio, Dublin will add project templates, and in the IIS Manager
it will add WF and WCF management modules. It also adds discovery within the
hosting environment, a SQL persistence provider and application monitoring,
and brings versioning, partitioning and routing to messaging.
But questions abound regarding Dublin. To my original point, during the Q&A
portion of the session several people were trying to get a grasp on whether
Dublin will ultimately subsume BizTalk. Microsoft architect Igor Sedukhin
said he doesn't see that happening. "Dublin is not intended to be an integration
server at all," he said. "We aren't trying to put all the adaptors
in Dublin. BizTalk is really focused on that integration scenario."
Cutting to the chase, one attendee asked: "Three years from now, will
BizTalk as a product exist, and if it does, why would I want to pay for it?"
Yes, it will still exist, Eshner said. "We really believe that there is
a ton of scenarios on BizTalk that we will not address in Dublin, or you would
have to do a whole bunch of work to make Dublin work in the same kind of way
that BizTalk does," he said, adding that Dublin won't have the transforms and
adaptors found in BizTalk. "BizTalk as an integration server is much more
powerful than what you get with an app server like Dublin."
Eshner and his team addressed a few more questions regarding Dublin, among them:
To what degree will Dublin scale to support enterprise-class applications?
That will become clearer over the next six months. Though probably not as scalable
as some would like, partners should be able to close the gap.
If Dublin is going to rely heavily on persistence, will it require shops
to purchase SQL Server?
The details are still being worked out, but to scale, that will probably be
a safe assumption.
What about transactions beyond the existing transaction services in Windows?
It's not clear how much will get added into version 1.
Will developers be able to deploy both locally and to Azure?
The Dublin team will be working with the IIS team using MS Deploy (Microsoft's
new IIS Web deployment tool) to see if it can be leveraged. "That's a great
thing to add to our future list to see how we can do that," Eshner said.
Have you looked at the Dublin bits? If so, drop me a line at jschwartz@reddevnews.com
and let me know what you think.
Posted by Jeffrey Schwartz on 11/12/2008
The names keep on changing at Microsoft. This week, SQL Data Services or SDS
(formerly SQL Server Data Services or SSDS) became part of a broader group called
"SQL Services." The technology is exciting even if the naming conventions
leave some developers scratching their heads.
SQL Services is part of the rollout for Windows
Azure -- another name that got a lot of people talking about Microsoft's
inability to communicate its promising technology to developers...or the world
at large, for that matter.
"I don't know how they come up with these names," voiced one Microsoft
partner during his presentation. "I just hope I'm pronouncing it right."
If he did, he was ahead of several Microsoft presenters and even some keynoters
who offered several "variations" of Azure in the same speech.
SQL Services is the data storage component of the Azure Services Platform for
building cloud-based apps. Just for showing up -- and for paying the $1,000-plus
conference fee -- PDC attendees got the coveted "goods," which included
a preview
of Windows 7, the first Visual
Studio 2010 CTP and an invitation to register for components of the Azure
Services Platform, including SDS provisioning.
Redmond Developer News Executive Editor Jeffrey Schwartz and I got
to sit down with Dave Campbell, the Microsoft Technical Fellow leading the SDS
effort. We didn't really touch on the name change except to confirm it, but
we did ask him all about Microsoft's evolving data platform. Look for our Q&A
in the Nov. 15 issue of RDN. And see "PDC:
Microsoft's Cloud-based SQL Services Redefined" for more data-related
announcements at PDC.
Is the economic climate piquing your interest in cloud-based utility services?
What would you like to see in SDS? Weigh in on SDS and Microsoft's naming habits
at krichards@reddevnews.com.
Posted by Kathleen Richards on 10/29/2008
When Microsoft
outlined its BI strategy for future releases of SQL Server at its Business Intelligence Conference 2008 in Seattle last week, the company put forth an ambitious road map that looks to broaden the reach of its data management platform.
Ted Kummert, corporate vice president of Microsoft's Data and Storage Platform Division, showcased three efforts in play. First is the next release of SQL Server, code-named "Kilimanjaro," due out in 2010 and intended to further penetrate the enterprise database market owned by Oracle and IBM.
Second is Project "Gemini," a set of tools Microsoft is developing with the aim of bringing BI to a wider audience of information workers. The third project he outlined was "Madison," aimed at taking the technology Microsoft acquired from DATAllegro, an Aliso Viejo, Calif.-based provider of data warehouse appliances, and developing its own offering to be sold as hardware.
In addition to trying to up the ante with enterprise deployments, perhaps more notable about Kilimanjaro is that "it signifies a greater emphasis towards supporting the needs of end users by leveraging the capabilities of SQL Server and the ubiquity of Excel," writes Ovum senior analyst Helena Schwenk in a bulletin to clients.
"These are unchartered waters for Microsoft," Schwenk warns. "While Excel is a pervasive BI tool, it has certain technical limitations that prevent it from being used as a full-blown reporting and analysis tool."
Despite the challenge, the next release of SQL Server promises to address these limitations, she adds. If Microsoft makes its delivery goals and can price it competitively, Schwenk believes Microsoft could make further inroads into the BI market at the expense of other BI vendors.
Still, IBM, Oracle and SAP aren't sitting still. With all three having made huge acquisitions over the past year, the battle to broaden BI is still at an early state of evolution.
Posted by Jeffrey Schwartz on 10/15/2008
As I pointed out in
my last post, Microsoft is rolling VSTS Database Edition into VSTS Developer Edition, and effective immediately those with Microsoft Software Assurance licenses can use both for the cost of one.
The company's goal: get more traditional developers delving into the database and vice versa. But Randy Diven, CIO of Modesto, Calif.-based produce supplier Ratto Brothers Inc., raised an interesting question:
"Will I end up with two installations, one being the Development install and one being the Database install or are the product features designed to integrate with each other?," Diven wrote. "I am not overly excited about installing VSTS twice on my machine."
After all, VSTS is a big install and the last thing he wanted was to end up rebuilding his workstation. "I am very interested in an integrated solution," Diven said.
Not to worry, said Cameron Skinner, product unit manager for Visual Studio Team System. "They integrate with each other," Skinner said in an e-mail. That helps, but Diven said he'd like to see an integrated install or clearer instructions that address interactions with Visual Studio 2008 SP1. "This is a big deal for VSTS programmers," he replied.
Skinner agreed and said that problem will go away with the next release. "This is a point-in-time problem with the current products and making them available given our decision to merge the SKUs," Skinner added. "Once we ship VS and VSTS 2010, the install will be integrated."
Posted by Jeffrey Schwartz on 10/03/2008
With Microsoft this week adding more information about its plans for the next release of its Visual Studio Team System, it bears noting that those who were not keen on upgrading from SQL Server 2005 to the new 2008 release may need to reconsider that stance.
That's because those who upgrade to TFS "Rosario" will need to use SQL Server 2008, as reported by my colleague, Redmond Developer News senior editor Kathleen Richards, who points to VSTS lead Brian Harry's blog. "That was a controversial decision, but it is a final decision," Harry writes. "The primary driving force behind it is that the Report Server feature in SQL Server 2008 is sooooo much improved over that in previous versions that we simply could not pass up taking advantage of it for Rosario."
But considering the substantial new reporting capabilities in SQL Server 2008 and the likely release date of VSTS 2010, there's a "compelling" case to be made for Microsoft's decision, according to Andrew Brust, chief of new technology at twentysix New York.
"While it's a tough call to tether one new release to another, and doing so risks alienating some users, it's also true that if Microsoft released a version of TFS that didn't take advantage of now-released SQL Server 2008 technology, that a year or so post-release, Rosario would look under-featured," Brust responded in an e-mail, when I asked how customers might react to this latest change.
Presuming Microsoft upholds its practice of including SQL Server Standard Edition in TFS moving forward, there are organizations that have strict policies about allowing new releases into their shops. Brust believes that, too, should not be a showstopper for VSTS shops. "Even in corporate environments where new versions of SQL need to be approved before deployment, one could make the argument that SQL 2008 is an intrinsic component of TFS Rosario and would thus qualify for a 'waiver' of sorts."
Another point worth noting: Microsoft is rolling VSTS Database Edition into VSTS Developer Edition, and effective immediately those with Microsoft Software Assurance licenses can use both for the cost of one. The goal: get more traditional developers delving into the database and vice versa, said Dave Mendlen, Microsoft's director of developer marketing, in an interview last week.
"Developers are more hybrid today than they were in the past ... this needs to work not just with the core source code but also with the database becoming more and more important to them," he said.
What's your take on these latest moves? Drop me a line.
Posted by Jeffrey Schwartz on 10/01/2008
In its latest bid to show that the Windows stack is suited for the most mission-critical applications, Microsoft's release of Windows HPC Server 2008 this week promises to extend the limits of Redmond's data platform.
I attended the High Performance on Wall Street conference in New York, where Microsoft launched Windows HPC Server, and the timing was quite ironic. On the one hand, Wall Street is undergoing a historic crisis -- indeed, the landscape of the entire financial services industry has unraveled. Meanwhile, IT vendors made the case for performing complex risk analysis across large clusters that could yield better transparency and performance using methodologies such as algorithmic trading.
For its part, Windows HPC Server 2008 will push the envelope for those looking to run such applications on the Windows platform. But with everything that's going on, it will be interesting to see whether the potential rewards of such capabilities increase investment in high-performance computing or whether the risk becomes more than organizations are willing to bear.
Posted by Jeffrey Schwartz on 09/24/2008
If you're a database developer, you may be wondering how SQL Server Data Services will affect how you build data-driven applications. SSDS is Microsoft's cloud-based repository that is available for testing through the company's community technology preview program.
When it comes to Microsoft's emerging cloud strategy, the company is giving a lot of airplay to SSDS because it epitomizes its mantra that enterprise customers are most likely to adopt a hybrid approach to premises and cloud-based services, which it calls "software-plus-services."
To be sure, Microsoft is not currently targeting SSDS for transaction-oriented applications, though if you are developing or administering OLTP applications, SSDS could become a repository for referential and/or backup data.
But of all the new data-driven technologies Microsoft is offering these days, SSDS will be viewed as the simplest, according to Jim Williams, an account architect at Microsoft. Williams gave a session on SSDS at VSLive! New York last week.
"You're not going to write SQL against SQL Server Data Services," Williams said. "You are not going to see tables, you are not going to see foreign keys, you're not going to see the concept of referential integrity that you are used to."
Among some questions Williams addressed in his session:
Will SSDS support transactions?
There's no transaction semantics in this offering today. There certainly could be one in the future... Since a SOAP interface is supported, it would certainly be possible to offer Web services transactions.
If it doesn't need a SQL interface, what's on the client?
Any technology that knows how to do SOAP or REST. The samples in the documentation cover Ruby, Java, and C#.
How will developers write queries against SSDS?
If you know LINQ, you know more than you need to make queries against SSDS the way it is today.
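What "any technology that knows how to do SOAP or REST" means from C# is simply issuing HTTP requests against the service. A hedged sketch of a REST-style GET; the endpoint URL, query syntax and credentials below are purely illustrative, not the CTP's exact format.

    using System;
    using System.IO;
    using System.Net;

    class SsdsSketch
    {
        static void Main()
        {
            // Hypothetical authority/container address and query string:
            string uri = "https://example.data.database.windows.net/v1/mycontainer" +
                         "?q='from e in entities where e[\"City\"] == \"London\" select e'";

            var request = (HttpWebRequest)WebRequest.Create(uri);
            request.Credentials = new NetworkCredential("user", "password"); // placeholder

            using (var response = request.GetResponse())
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                Console.WriteLine(reader.ReadToEnd()); // entities returned as XML
            }
        }
    }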
If you're interested in SSDS, you won't want to miss the detailed TechBrief by Roger Jennings, principal with OakLeaf Systems, which is in the current issue of Redmond Developer News.
Posted by Jeffrey Schwartz on 09/15/2008
Oslo, the code name for Microsoft's next-generation modeling platform championed by chief software architect Ray Ozzie, is shaping up to have a prominent role at next month's Professional Developers Conference in Los Angeles.
While tidbits of information continue to unfold, it became apparent at this week's VSLive! New York show that Oslo will be one of many key technologies Microsoft showcases, and that it will center around Microsoft's BizTalk Services, as several speakers pointed out (not to be confused with Microsoft's BizTalk Server, which the company is planning to upgrade).
Douglas Purdy, a product unit manager at Microsoft, revealed in a blog posting earlier this week that he will be giving a presentation on Oslo. In his posting he broke it down into three components:
- A tool that helps people define and interact with models in a rich and visual manner
- A language that helps people create and use textual domain-specific languages and data models
- A relational repository that makes models available to both tools and platform components
"That is it," Purdy wrote. "That is all Oslo is. Oslo is just the modeling platform." The question is what does that mean to .NET developers? Speaking during a panel session at VS Live! Brian Randell, a senior consultant at MCW Technologies, is that it will broaden programming to a much wider audience.
"His vision is that everyone can be a programmer," said Randell. "The idea behind this is they want to make building complex systems easier, and where the big word is modeling."
Still, there was a fair amount of skepticism at VSLive! about Oslo as well. "It's important to realize that this whole Oslo initiative is an umbrella term that's talking essentially about a 10-year vision," said Rockford Lhotka, principal technology evangelist at Magenic Technologies, who was on the same panel.
Microsoft's announcement yesterday that it will join the Object Management Group was also a sign that Oslo will embrace the Unified Modeling Language.
Posted by Jeffrey Schwartz on 09/10/2008