Data Driver

What Data Developers Want for the Holidays

Dino Esposito isn't asking for much from Santa this year. Nothing new or bleeding-edge. In fact, he kind of wants to step back in time, in search of simplified SQL querying:

I'd love to have back a framework that was in beta testing and probably even in production around SQL Server a decade ago: making queries in plain English, like "give me all customers based in WA." The code was amazingly able to make most of them--or at least get close, anyway. I'm working on a simplified version of it--so it would be really great to have it from Santa!

Esposito is talking about English Query, a project for SQL Server 2000 that he was involved in some 13 years ago. Esposito, a well-known developer, book and article author, presenter, trainer and all-around technical expert based in Italy, shared his thoughts with me in an informal survey I took of data developers with equally sparkling credentials, asking what their data development holiday wishes were. Following are some of their thoughts.

Dr. James McCaffrey, who manages training for Microsoft software engineers in Redmond, among many other projects, had this to say:

I get the feeling that there's a lot of flux with MVVM, MVC, and MVP and so my wish (assuming that I'm right and that there is flux) is to see some stability emerge here.

McCaffrey is right about most things, to understate it, and a lot of Microsoft products do seem to be in transition, so some stability in 2014 would be nice.

Brandon Satrom is an HTML5 expert, among a lot of other things, at Telerik.

For us, the biggest wish on the list is for a FULL OData implementation for both MVC and WebAPI. The end result we're looking for is the ability to fully and dynamically query a dataset based on URL parameters. Full OData support would be an awesome start, and if we're extra lucky this year, perhaps Dynamic LINQ integration for Entity Framework as well.
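For readers who haven't tried OData query options, here's roughly the shape of what Satrom is after--a sketch only, assuming the ASP.NET Web API OData package's [Queryable] attribute (later renamed [EnableQuery]) and an invented Customer entity and NorthwindContext:

using System.Data.Entity;
using System.Linq;
using System.Web.Http;

// Hypothetical entity and context, purely for illustration.
public class Customer
{
  public int Id { get; set; }
  public string CompanyName { get; set; }
  public string Region { get; set; }
}

public class NorthwindContext : DbContext
{
  public DbSet<Customer> Customers { get; set; }
}

public class CustomersController : ApiController
{
  private readonly NorthwindContext _db = new NorthwindContext();

  // Returning IQueryable lets OData query options in the URL shape the query
  // that actually runs against the database, for example:
  //   GET /api/customers?$filter=Region eq 'WA'&$orderby=CompanyName&$top=10
  [Queryable]
  public IQueryable<Customer> Get()
  {
    return _db.Customers;
  }
}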

Fellow Teleriker Chris Sells is vice president of the Developer Tools Division at the company.

I think what most data developers want for Xmas is an end-to-end, offline-enabled, client-side and mobile-focused data source stack. The occasionally connected story is a hard one for any of the mobile OSes (and it's no picnic for desktop OSes, either), so something simple, capable, robust and cross-platform for the client-side data story is what I'm looking for in my Xmas stocking from Santa!

Noted author Peter Vogel is a principal at PH&V Information Services.

What I'd like is some reliable way to move changes from development to production that won't drive my DBA crazy. Microsoft's new SQL deployment package is great--but if deploying a package on my Web server causes changes in my database, my DBA is going to [Editor's note: just substitute "do painful things to me" here; suffice it to say that Vogel's DBA has some anger management issues] (and I'm opposed to that).

Some reliable tool to estimate "response time under load" would be great. It would (a) take a picture of how busy my database server is over the course of a day and (b) estimate the response time for all the data access operations in my application (and tie those operations to my UI and services). I'd then specify how much each part of my UI and my SOA will be used in production, and the tool would estimate my response time for each UI component or service operation throughout the day, highlighting those that exceed some allowable limit.

Jeremy Likness, multiple book author and principal consultant for Wintellect LLC in Atlanta, thinks some of his wishes might be coming true.

True asynchronous support in Entity Framework and other data providers. Not just wrapping requests in a task, but the actual asynchronous implementation that will scale correctly in highly concurrent environments.

Better/easier extensibility of OData across various producers (that is, WCF 4) and consumers.

Consistent APIs across platforms--that is, a standard data solution for Windows 8, Windows Phone 8, the server, and so on that can be accessed through a common API, so it's not a completely different repository and data access layer for each implementation.

Stronger support in database projects for schema changes--that is, I know there is the compare/publish, but I want an explicit way to write migrations so you can have push-button updates out of the box. For example, if I iterate within a sprint and change a few items, I'm readily prompted to fill in any issues with the schema (defaults, data moves and seed data), and I get two specific outputs: a creation script (start from scratch) and a migration script (upgrade from the previous iteration)--again, as part of a build and not an interactive schema compare.
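To make Likness's first point concrete, here's the difference between faking asynchrony by wrapping a query in a task and using a provider that is asynchronous all the way down. The sketch assumes Entity Framework 6's ToListAsync and an invented Blog model--illustration only, not his code:

using System.Collections.Generic;
using System.Data.Entity;   // EF6: DbContext, DbSet and the ToListAsync extension
using System.Linq;
using System.Threading.Tasks;

public class Blog
{
  public int Id { get; set; }
  public string Title { get; set; }
}

public class BlogContext : DbContext
{
  public DbSet<Blog> Blogs { get; set; }
}

public static class AsyncQueries
{
  // "Fake" async: the query still runs synchronously on a thread-pool thread,
  // so under heavy concurrency you're just burning threads.
  public static Task<List<Blog>> GetBlogsWrapped(BlogContext db)
  {
    return Task.Run(() => db.Blogs.ToList());
  }

  // True async: the provider issues a non-blocking database call and releases
  // the thread until results arrive, which is what scales in highly
  // concurrent environments.
  public static async Task<List<Blog>> GetBlogsAsync(BlogContext db)
  {
    return await db.Blogs.ToListAsync();
  }
}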
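And on his last wish: he's talking about database projects, but the closest thing available today is probably Entity Framework Code First Migrations, which already gives explicit, scriptable schema changes. A rough sketch, with invented table and column names:

using System.Data.Entity.Migrations;

// An explicit, reviewable schema change that gets checked in with the
// sprint's code rather than produced by an interactive schema compare.
public partial class AddCustomerRegion : DbMigration
{
  public override void Up()
  {
    AddColumn("dbo.Customers", "Region", c => c.String(maxLength: 2));
  }

  public override void Down()
  {
    DropColumn("dbo.Customers", "Region");
  }
}

// From the Package Manager Console, both of the outputs he mentions can be
// generated as scripts rather than applied interactively:
//   Update-Database -Script -SourceMigration $InitialDatabase   (creation script)
//   Update-Database -Script                                     (migration script)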

Sean Iannuzzi is a solutions architect for The Agency Inside Harte-Hanks. He took a lot of his precious time to give me an extremely detailed reply. It's great stuff, so I'm sharing it all with you.

What developers want as the perfect data improvement gift would be easier data integration from Entity Objects to Data Contracts or Model objects, model objects that are exportable in a fixed data format, and complete model data lists when working with Views in Razor (a Data Model Extension).

Entity Objects to Data Contracts or Model Objects

Most of the time, when building Web sites or applications, either data contracts or models are needed to support various differences in the UI versus the data layer. As a result, a mapping exercise is needed to link the two together. I usually use AutoMapper, as it handles this mapping very well, but it would be awesome if this was included as part of the [.NET] framework.
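For illustration, the mapping he describes looks something like this with AutoMapper's classic static API; CustomerEntity and CustomerModel are invented types, not his:

using AutoMapper;

public class CustomerEntity
{
  public int CustomerId { get; set; }
  public string Name { get; set; }
}

public class CustomerModel
{
  public int CustomerId { get; set; }
  public string Name { get; set; }
}

public static class MappingConfig
{
  public static void Configure()
  {
    // One-time configuration; properties with matching names map automatically.
    Mapper.CreateMap<CustomerEntity, CustomerModel>();
  }

  public static CustomerModel ToModel(CustomerEntity entity)
  {
    return Mapper.Map<CustomerModel>(entity);
  }
}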

Export Compact Data Elements

Another item related to data development that would be a great feature is allowing certain fields in data contracts to be marked for different levels of return options. For example, at times I may want to lazy load all of my data and only need the IDs and not all of the data associated with the data contracts. What would be awesome would be a way to annotate the data fields with levels that would control when they would be included in the return set: something such as deep, medium and light contract members, which could be added at the field level. Light contracts could just include the ID fields, medium would include ID fields and the parent records, and the deep contracts could return all data in the hierarchy structure. What would be really awesome is if this was figured out for you automatically, but that's just a wish and very unlikely.
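Nothing like this ships in the framework, but a purely hypothetical sketch of the annotation being wished for might look like the following; ContractLevel, ContractMemberAttribute and OrderContract are all invented names:

using System;
using System.Linq;
using System.Reflection;

public enum ContractLevel { Light, Medium, Deep }

[AttributeUsage(AttributeTargets.Property)]
public class ContractMemberAttribute : Attribute
{
  public ContractLevel Level { get; private set; }
  public ContractMemberAttribute(ContractLevel level) { Level = level; }
}

public class OrderContract
{
  [ContractMember(ContractLevel.Light)]  public int Id { get; set; }
  [ContractMember(ContractLevel.Medium)] public int CustomerId { get; set; }
  [ContractMember(ContractLevel.Deep)]   public string Notes { get; set; }
}

public static class ContractFilter
{
  // Returns the property names a serializer could include for the requested
  // level; Light members are included at every level, Deep only at Deep.
  public static string[] MembersFor(Type contractType, ContractLevel requested)
  {
    return contractType.GetProperties()
      .Where(p =>
      {
        var attr = (ContractMemberAttribute)Attribute.GetCustomAttribute(
          p, typeof(ContractMemberAttribute));
        return attr != null && attr.Level <= requested;
      })
      .Select(p => p.Name)
      .ToArray();
  }
}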

Fixed Format Export Options

At times, exports are needed for data that's in a fixed field format that's used in a Web application or service. A great feature would be to allow annotations to support how the data could be exported and then, through reflection, pull in the attributes based on the model.

Something such as:

[AttributeUsage(AttributeTargets.All)]
public class FlatFileAttribute : System.Attribute
{
  public int fieldLength { get; private set; }
  public int startPosition { get; private set; }
  /// <summary>
  /// File Attribute constructor to set
  /// the start position and field length
  /// </summary>
  public FlatFileAttribute(
    int startPosition, int fieldLength)
  {
    this.startPosition = startPosition;
    this.fieldLength = fieldLength;
  }
}
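The attribute above declares the layout; a rough sketch of the reflection side that could consume it might look like this (CustomerExport and FlatFileWriter are invented for illustration):

using System;
using System.Linq;
using System.Reflection;
using System.Text;

public class CustomerExport
{
  [FlatFile(0, 10)]  public string AccountCode { get; set; }
  [FlatFile(10, 30)] public string Name { get; set; }
}

public static class FlatFileWriter
{
  // Builds one fixed-width record by reading each property's FlatFileAttribute
  // and padding or trimming its value to the declared length.
  public static string ExportRecord(object model)
  {
    var buffer = new StringBuilder();
    var fields = model.GetType().GetProperties()
      .Select(p => new
      {
        Property = p,
        Attr = (FlatFileAttribute)Attribute.GetCustomAttribute(
          p, typeof(FlatFileAttribute))
      })
      .Where(x => x.Attr != null)
      .OrderBy(x => x.Attr.startPosition);

    foreach (var f in fields)
    {
      var value = (f.Property.GetValue(model, null) ?? "").ToString();
      buffer.Append(value.PadRight(f.Attr.fieldLength)
                         .Substring(0, f.Attr.fieldLength));
    }
    return buffer.ToString();
  }
}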

Razor Data Model Extension

The last feature that I would like automatically included is the ability to map data elements from a hierarchical model to a view without the need for an extension, and for the data fields to be included as part of the model. For example, if you have a model with a list of subelements and you are creating them on the view, they will be null by default. To remedy this, I usually create an extension method so that the data is included with the model--so that all model fields are included for an object such as Parent.Children, where Children is a collection beneath the Parent object. This would be a nice feature as well.

public static IDisposable BeginCollectionItem(
  this HtmlHelper html, string collectionName)
{
  return BeginCollectionItem(html, collectionName, "", "");
}
public static IDisposable BeginCollectionItem(
  this HtmlHelper html, string collectionName, 
  string prefix, string suffix)
{
  var idsToReuse = 
    GetIdsToReuse(html.ViewContext.HttpContext, collectionName);
  string itemIndex = idsToReuse.Count > 0 ? idsToReuse.Dequeue() : 
    Guid.NewGuid().ToString();
       
  html.ViewContext.Writer.WriteLine(
    prefix + string.Format(
      "<input type=\"hidden\" name=\"{0}.index\" " +
      "autocomplete=\"off\" value=\"{1}\" />",
      collectionName, html.Encode(itemIndex)) + suffix);
  return BeginHtmlFieldPrefixScope(
    html, string.Format("{0}[{1}]", collectionName, itemIndex));
}

I'd like to thank all of these guys for taking the time to share their thoughts with you. And I'd like to continue the conversation. What would you like to see in the coming year in terms of data development technologies? Please comment here or drop me a line.

Posted by David Ramel on 12/20/2013

