LavaBlast Software Blog

Help your franchise business get to the next level.

A first in the competitive stuff-your-own teddy bear industry

clock December 9, 2007 11:45 by author JKealey

We have just released a new feature on the Teddy Mountain Stuff-Your-Own Teddy Bear website which we believe is an industry first. Teddy Mountain sees itself as the most innovative teddy bear franchise, and they use LavaBlast's technology for both their supporting infrastructure and their client-facing applications.
We've done a number of industry-specific solutions for this franchisor:

  • Interactive Kiosk with webcam and touch screen to create your own teddy bear adoption certificate.
  • Point of Sale: Integrated with the kiosk, our point of sale is easy to use and reduces training costs.
  • Website: Online sales via an e-commerce engine with a few teddy bear industry-specific features.
  • FranchiseBlast: Our centralized management and collaboration portal which ties everything together, providing a single environment to modify the product line, store pricing, collaborate with others, share documents, view reports, etc.

Today, we augmented the integration between the points of sale distributed across the world and the Teddy Mountain website. We now allow members of the frequent buyer program to view adoption certificates they have created in brick and mortar stores… online! Simply put, existing PaLS members from a select number of stores are able to see their frequent buyer card balance and birth certificate history on the Teddy Mountain website. To our knowledge, no other franchisor in the teddy bear industry has done anything similar, and we are proud to see Teddy Mountain lead the way. Of course, to address privacy concerns, only those who sign up for the program will have their pictures made available online to distribute to their friends and family.

We feel that, in the long term, this feature will improve gift card sales from out-of-town family members as the donor can receive visual feedback from the recipient, via the Internet.

If the feature attracts some interest, we are open to implementing new features such as integration with Facebook or Google's OpenSocial API. We shall also add features such as emailing certificates to friends and family with a greeting.

Debug Visualizer for SubSonic Collections

clock December 7, 2007 20:00 by author EtienneT

Wouldn't it be nice to be able to see a SubSonic collection while you are debugging, just like with the DataSet debug visualizer? Because we often need such a tool to debug our SubSonic collections here at LavaBlast, I've created a small Visual Studio debug visualizer that you can use to see what your SubSonic collections contain.  How does it work?  While you are debugging, put a breakpoint somewhere and simply hover your mouse over a SubSonic collection variable.  You should see something similar to the screenshot shown above.

Once the tooltip has appeared, you can click on the small magnifier to open the debug visualizer.  You'll see a DataGridView similar to the picture on the right, showing all your SubSonic objects in an easy-to-navigate list.  Basically, my code takes the SubSonic list and transforms it into a DataTable.  I can't use the ToDataTable() method from the SubSonic AbstractList because it requires access to the provider's configuration, and the process executing the debug visualizer doesn't have access to it.
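For the curious, the core of such a visualizer is quite small. The sketch below is a hypothetical reconstruction (class names and details are assumed, not taken from the attached source): a DialogDebuggerVisualizer that receives the deserialized collection and flattens it into a DataTable via reflection, precisely because ToDataTable() is unavailable outside the provider's configuration.

```csharp
using System;
using System.Data;
using System.Reflection;
using System.Windows.Forms;
using Microsoft.VisualStudio.DebuggerVisualizers;

// Hypothetical sketch; the attached SubSonicVisualizer.dll may differ in details.
public class SubSonicCollectionVisualizer : DialogDebuggerVisualizer
{
    protected override void Show(IDialogVisualizerService windowService,
                                 IVisualizerObjectProvider objectProvider)
    {
        // The collection is serialized in the debuggee and rebuilt here.
        System.Collections.IEnumerable list =
            (System.Collections.IEnumerable)objectProvider.GetObject();

        // Flatten via reflection, since AbstractList.ToDataTable() needs the
        // provider configuration, which is unavailable in the visualizer process.
        DataTable table = new DataTable();
        foreach (object item in list)
        {
            PropertyInfo[] props = item.GetType().GetProperties();
            if (table.Columns.Count == 0)
                foreach (PropertyInfo p in props)
                    if (p.CanRead && p.GetIndexParameters().Length == 0)
                        table.Columns.Add(p.Name);

            DataRow row = table.NewRow();
            foreach (PropertyInfo p in props)
                if (p.CanRead && p.GetIndexParameters().Length == 0)
                    row[p.Name] = p.GetValue(item, null) ?? DBNull.Value;
            table.Rows.Add(row);
        }

        // Display the result in a simple read-only grid.
        Form form = new Form { Text = "SubSonic Collection" };
        DataGridView grid = new DataGridView
        {
            DataSource = table,
            Dock = DockStyle.Fill,
            ReadOnly = true
        };
        form.Controls.Add(grid);
        windowService.ShowDialog(form);
    }
}
```

The key constraint is that the visualizer runs in its own process, so anything requiring the debuggee's configuration (connection strings, providers) is off-limits; reflection over the already-materialized objects sidesteps that.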

To use this simple tool, download this file SubSonicVisualizer.dll (8.00 kb) and put it in your [My Documents]\Visual Studio 2008\Visualizers\ folder, creating the directory if it doesn't already exist. (You might want to double check your file system permissions on this folder, as well.) I compiled this for Visual Studio 2008.  I have also included the source code here (Visual Studio 2008) in case anyone wants to enhance it. (386.36 kb)

If you are using Visual Studio 2005, I think this file will work for you, but I have not tested it: SubSonicVisualizer2005.dll (8.00 kb). I simply changed the reference from Microsoft.VisualStudio.DebuggerVisualizers.dll version 9.0 to 8.0 and, from what I have read, it should work in VS.NET 2005.


Webcam surveillance in FranchiseBlast

clock December 6, 2007 23:11 by author JKealey

With the Teddy Mountain teddy bear stuffing franchise, we’re fortunate enough to work with a technically savvy franchisor. Our website describes various elements that we’ve produced for their great franchise system. LavaBlast is proud to help centralize the Teddy Mountain franchise operations and bring the franchise offering to the next level. We’ve been working with them on various solutions since their early beginnings, and our tailor-made solution has grown with their needs.

Today, we launched a simple feature that allows franchisees to remotely monitor their stores, using security webcams that were installed when the stores opened. This is something they could already do, but we’ve integrated it into our solution so that they only have one place to go on the web for their product pricing, reports, etc.

The integration into our solution was a quick job thanks to the infrastructure we already had in place. We do little side-projects like this just for fun, to clear our minds!

PS: The Teddy Mountain store decor is absolutely fabulous. The Imagination Retail Group has found a great balance between visual appeal and supporting infrastructure.



Multitasking while Visual Studio builds

clock December 5, 2007 13:25 by author EtienneT

Are you using Visual Studio and your solutions are taking longer than 10s to build? Why not put those 10s to good use and do something other than stare emptily at your poorly decorated office? Maybe you are worried that your build will finish before you expect it to and you don't want to lose precious time? You don't have to worry about this anymore thanks to what I found today! You can now feel free to peruse a few blog posts while Visual Studio is building!  [insert witty comment about how men have difficulty multi-tasking here]. 

Robert Robins posted this on his blog recently and I didn't even know it existed! You can configure VS.NET build sounds that inform you when the build is done!

Go to the Control Panel, Sound and Audio Devices, Sounds Tab, Microsoft Visual Studio, and configure the sounds you'd like to play when the build succeeds/fails. LavaBlast is not responsible for what your coworkers do to you after configuring Pee-wee Herman or a bomb siren as your default sounds.

Happy coding!






SubSonic Limitations - Part 2 (aka: Knee deep in … SubSonic. )

clock December 4, 2007 22:02 by author JKealey
Knee deep in snow.

After my recent post asking for the most elegant way to support multiple databases with the same schema at runtime, I received some good pointers in the SubSonic forums from dbr, a forum user. In the end, I admit I should have done my homework before posting.

One elegant solution to change the SubSonic Provider/ConnectionString at runtime makes use of SharedDbConnectionScope. I personally do not like this solution, as I prefer my code to explicitly state what it’s doing via its properties or arguments instead of relying on contextual information.  I was also concerned about how it works with regard to concurrency, so I did a little digging. Looking at the code, I discovered it internally uses the ThreadStatic attribute, which seems like a godsend at first, but further investigation reveals the implementation may be flawed. I did see people complain that it didn’t work for them, but I don’t know if that is related to the ThreadStatic attribute. I do not fully trust this technique, but I may be wrong, as I'm far from an expert in concurrency.
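For reference, the scope-based approach looks roughly like the sketch below (the connection string, table, and column names are invented for illustration, and I'm assuming the scope constructor accepts the target connection string); everything inside the using block contextually picks up the scoped connection instead of the default provider's:

```csharp
// Hedged sketch: Product is an assumed generated ActiveRecord class.
using (SharedDbConnectionScope scope =
       new SharedDbConnectionScope("Server=dbserver;Database=StoreB;Integrated Security=SSPI;"))
{
    // All SubSonic data access in this block uses the scoped connection,
    // not the connection string baked into the generated provider.
    Product p = new Product(42);   // load by primary key
    p.UnitPrice = 9.99m;
    p.Save();
}
```

This is exactly the "contextual information" I dislike: nothing in the Product calls themselves says which database they hit, and the ThreadStatic-based scope is what raised my concurrency concerns.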

Returning to Dbr's suggestion, he simply generates different providers at runtime for each connection string. This sounds simple if you can manage to modify the ProviderName property on the collection (ActiveList) or object (ActiveRecord) every time you load/save from the database. Without resorting to SharedDbConnectionScope, you can't use the auto-generated constructors because they fall back to the default provider which is hardcoded in the generated object’s schema.

The elegant implementation to encapsulate loading/saving from the database is to use a controller, as would be suggested by the MVC design pattern. I have not yet played with the new MVC templates provided by SubSonic, but we already use a good generic SubSonicController here at LavaBlast.

I wanted to re-write my object loading/saving code using this new solution to get rid of my inelegant concurrency locks. Although obvious in appearance, I encountered a few little hiccups along the way and I thought I'd post my findings here.

Limitation 1: You can't create an object by specifying its ProviderName in an ActiveRecord constructor using the default generated code.

  • Workaround: You need to load it using a collection, which supports the ProviderName.
  • Workaround 2: Use SharedDbConnectionScope
  • Workaround 3: Change the code templates to add new constructors.
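As a rough illustration of the first workaround (collection, table, and column names are made up here), loading a single record through a collection with an explicit provider might be shaped like this:

```csharp
// Hypothetical sketch; Where() filtering on the collection is limited
// (see Limitation 2 below), so the filter goes through a Query that
// carries the same ProviderName, per dbr's suggestion.
ProductCollection coll = new ProductCollection();
coll.ProviderName = "StoreBProvider";

Query q = new Query(Product.Schema.TableName, "StoreBProvider");
q.AddWhere(Product.Columns.ProductID, 42);
coll.LoadAndCloseReader(q.ExecuteReader());

Product p = (coll.Count > 0) ? coll[0] : null;
```

Note that the ActiveRecord you get back still won't know which provider it came from, which is Limitation 4 below.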

Limitation 2: You can't use a collection's Where parameter to load your data (via its primary key or other filter), because of incomplete framework code. Hopefully this will be resolved soon (see issue 13402).

  • Workaround: Copy-paste the code used internally by the Collection, but pass the extra ProviderName parameter to the new Query.

Limitation 3: You can't specify the ProviderName property on an ActiveRecord because the setter is marked as protected.

  • Workaround: Change the code templates and add a public method/property that sets ProviderName from within the class.
  • Workaround 2: Use SharedDbConnectionScope.

Limitation 4: When you load an ActiveRecord by running a query or by loading a Collection, the ActiveRecord does not inherit the ProviderName from the Collection/Query. This is probably due to Limitation 3.  

My current prototype no longer uses the c# lock keyword. I create instances of a controller, passing in which connection string name to use. All database loading/saving is done via this controller, for which I have attached sample code extracts. I managed to get object loading code to my liking, but I had to resort to SharedDbConnectionScope for saving. Once the minor limitations in the framework are resolved, I will be more comfortable with the code.
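In broad strokes, the controller-per-connection approach looks like the sketch below (class, method, and property names are invented for illustration; the actual attached extracts differ). The lock keyword disappears because nothing mutates shared provider state:

```csharp
// Hypothetical sketch of a controller bound to one store's database at construction.
public class StoreProductController
{
    private readonly string providerName;
    private readonly string connectionString;

    public StoreProductController(string providerName, string connectionString)
    {
        this.providerName = providerName;
        this.connectionString = connectionString;
    }

    // Loading carries the provider explicitly through the Query...
    public ProductCollection FetchAll()
    {
        Query q = new Query(Product.Schema.TableName, providerName);
        ProductCollection coll = new ProductCollection();
        coll.LoadAndCloseReader(q.ExecuteReader());
        return coll;
    }

    // ...but saving still falls back to SharedDbConnectionScope,
    // because the ActiveRecord's ProviderName setter is protected (Limitation 3).
    public void Save(Product p)
    {
        using (SharedDbConnectionScope scope = new SharedDbConnectionScope(connectionString))
        {
            p.Save();
        }
    }
}
```

The asymmetry between FetchAll and Save is precisely why I'll only be fully comfortable once the framework limitations are resolved.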

In summary, I did manage to get a working prototype and I have attached the relevant portions of code that works with data from the appropriate database (chosen at runtime). Hope this helps!

SubSonic Limitations

clock November 30, 2007 10:23 by author JKealey

Question: What is the most elegant way to reuse code generated by SubSonic for tables that share the same schema in different databases?

  • Ideally, I would have a shared schema definition, generated once, and seamlessly integrated into the code generated for each separate provider.
  • Creating a separate DataProvider for a subset of tables reduces the amount of code that is generated, but is not very convenient to use if you do not use the same namespace for all your projects.
  • Creating a separate DataProvider does not solve the problem of database selection at runtime.

LavaBlast's integrated franchise management solution operates on a centralized database and a data warehouse which collects data from all our points of sale. Recently, we decided we wanted to create some management pages for our various e-commerce websites in our centralized portal. Because our recently developed e-commerce backend is the same as our point of sale (reuse++), we automatically obtained features like centralized product line and pricing management for our store fleet (featureSynergy++). However, we wanted to be able to process website users and orders from this same central portal, not on each individual site.

My first question was how do we get the union of the data from the same table in multiple databases? One solution would be to merge these into the data warehouse, but we didn't want to build complex infrastructure to bring the data into the warehouse and push the changes back out when necessary. I suppose having everything in the same database in the first place would be a solution, but it is not how we architect our systems. SQL Server replication might be useful, but it is not bidirectional with SQL Server Express. I could easily write a view containing a UNION query that merges the data from the set of databases, but that would be a maintenance problem: for each table, I would have to hardcode the list of databases.

I wrote a quick stored procedure that builds the UNION query from a table of Website-to-DatabaseName mappings, given a few parameters. It is inefficient and is not strongly typed (hence it feels dirty) but, given the volume of data on these sites, it is good enough for now without being a maintenance pain. By passing a few parameters to the stored procedure, we can filter the rows before the union and thus improve performance. I am curious to know if there are more elegant solutions to this problem.

Anyhow, with this first problem solved, we could bind our GridView to a DataTable produced by the execution of a StoredProcedure and see the merged results. However, because we have a standard infrastructure that makes good use of SubSonic magic for filtering, paging, and sorting, this was not enough. Our infrastructure only works on views or tables in our central database, not on arbitrary results returned by stored procedures. Therefore, SubSonic did not generate any code for the merged tables, in the central database. Still, thanks to the SubSonic Provider model, we managed to load a collection based on the type defined in one DataProvider (point of sale) using data provided by the stored procedure, in another DataProvider (central server). Below, an example without any filtering, sorting or paging.

SubSonic.StoredProcedure sp = SPs.WebsiteUnionOfTables(POSBOLib.Generated.ShoppingCart.ViewWebUser.Schema.TableName, "*", string.Empty, string.Empty);
POSBOLib.Generated.ShoppingCart.ViewWebUserCollection coll = new POSBOLib.Generated.ShoppingCart.ViewWebUserCollection();
coll.LoadAndCloseReader(sp.GetReader()); // load the merged rows into the strongly-typed collection

With a bit more work on the stored procedure, we can make it efficient, but we don't want to use T-SQL all that much, to make everything easier to maintain. (We could use CLR stored procedures, but that's another story).

My second question was how am I going to update this data? When manipulating the data, I know from which database it comes from thanks to an additional column appended by my stored procedure, but I cannot create an updatable SubSonic object with this, and I don't feel like writing SQL anymore, now that we use SubSonic. However, the DataProvider name is a hardcoded string in the auto-generated code… and changing the templates to pass in extra parameters looks like too much work in addition to breaking the simplicity of the framework.

Having played with the DataProvider model, one idea that came to me was to switch the provider context dynamically at runtime. The framework doesn't support this, so I had to hack it in and make sure all my data access was contained in critical sections (using the lock keyword) which begin with an invocation of the following method.

Another option, which just came to me now, would be to obtain the SQL generated by SubSonic during an operation and perform string replacements to forward the requests to the appropriate database. This too is too much of a hack, however, since it depends on the implementation details and the DBMS.

In conclusion, I did manage to build a working prototype using locks and the above code, but I feel the code is too dirty and I am open to suggestions from SubSonic experts (I'm looking at you Rob Conery and Eric Kemp). If there is a clean way to do it, I would love to contribute it to the SubSonic project!

Read Part 2.

The Mysterious Parameter Is Not Valid Exception

clock November 29, 2007 20:39 by author JKealey

For a number of weeks, we had been encountering an odd exception on rare occasions. Typically, our point of sale would run flawlessly up until a very busy day where it would not want to render some of our cached images.

System.Web.HttpUnhandledException: Exception of type 'System.Web.HttpUnhandledException' was thrown. ---> System.ArgumentException: Parameter is not valid.
at System.Drawing.Image.Save(Stream stream, ImageCodecInfo encoder, EncoderParameters encoderParams)
at System.Drawing.Image.Save(Stream stream, ImageFormat format)

We searched Google for the possible cause of this error and found a bunch of irrelevant posts from people who get this error message every time they execute their code. In addition, we discovered that it is a generic message that could mean a null image, an inappropriate image format, etc. We figured it must have something to do with memory usage because of the time it took before it occurred in production. However, we knew that the ASP.NET worker process had not restarted because of excessive memory usage.

We ran stress tests on our machines and never managed to replicate the error. In one session, we loaded all the birth certificates that a store had ever created, hundreds of times more than what they would do on their busiest day. We were still unable to replicate the issue. (Here at LavaBlast, we mostly use NUnit and NUnitASP for our unit testing of ASP.NET applications.)

Then we found a post saying that you might want to copy from the image, to a new MemoryStream instead of directly outputting to the Response.OutputStream of an ASP.NET application.

The relevant source code looks like this:

public static void CopyToStream(Bitmap image, Stream outputStream, ImageFormat format)
{
    using (MemoryStream stream = new MemoryStream())
    {
        image.Save(stream, format);
        stream.WriteTo(outputStream); // copy the buffered bytes to the real output
    }
}

The code is accessed this way in one of our ASP.NET handlers (ASHX file):

Bitmap image = null;
// load the image from the cache (code omitted)
if (image != null)
{
    HttpResponse response = context.Response;
    response.ContentType = "image/jpg";
    CopyToStream(image, response.OutputStream, ImageFormat.Jpeg);
}

The end result is that this makes no difference at all. Bummer! Because we had to wait a week to get the results in production, we needed to replicate this error on our development machines. We quickly realized that we had overlooked the most obvious of solutions in the first place. We knew the image was not null and we knew it was still in the cache, but we had never checked to see if it was disposed! Yes, somehow the images in our cache were disposed by some external process, but not by the cache itself, which would have removed them from the cache beforehand. Once a System.Drawing.Image is disposed, all of its properties return the Parameter is not valid error. In our image cache, we coded a quick hack that would test whether the image.Height property threw this error: if it did, we reloaded the image from the database. (Note: Images do not have an IsDisposed property.)
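The hack boiled down to probing a property that throws once the image has been disposed. Roughly, with invented cache and loader names (LoadImageFromDatabase is an assumed helper, not our actual code):

```csharp
// Hypothetical sketch of the defensive cache read.
public Bitmap GetCachedImage(string key)
{
    Bitmap image = (Bitmap)HttpRuntime.Cache[key];
    if (image != null)
    {
        try
        {
            // Any property access throws ArgumentException ("Parameter is not valid")
            // once the image has been disposed behind our back.
            int probe = image.Height;
        }
        catch (ArgumentException)
        {
            image = LoadImageFromDatabase(key); // assumed helper: reload from storage
            HttpRuntime.Cache.Insert(key, image);
        }
    }
    return image;
}
```

As the rest of the post explains, this probe is not actually safe: the image can be disposed between the probe and the moment it is written to the response, which is why the real fix was caching a byte array and curing the memory leak.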

Obviously, this hack was not very reassuring. While Etienne took on the task of refactoring the image cache to store a byte array instead of a System.Drawing.Image, I took out the heavy artillery to find the root cause of this exception. By using JetBrains dotTrace 3.0, a superb tool for profiling .NET applications (both WinForms and web applications), I discovered a memory leak in our application. I cannot overstress how glorious this tool is. It is simply excellent and it saved me tons of time.

In any case, before fixing the memory leak, I reduced the maximum memory IIS allows my worker process to 16 MB. (My machine has 4 GB of RAM; that's why we never discovered the flaw in the first place. We should have tested on our sample production hardware instead… but that's another story.) With such low memory, I was quickly able to cause the worker process to restart when trying to load too many images (all the images the store had ever produced, once again). Between worker process restarts, I managed to replicate the elusive Parameter is not valid exception. Debugging under this scenario with scarce resources, I discovered that the image was being disposed in the short lapse of time between its creation and its output, revealing that no amount of quick hacks would have solved this issue.

Returning to the memory leaks with JetBrains dotTrace, we found them quickly, and the application managed to run our nasty stress test with 32 MB assigned to the worker process.

In conclusion, there are no real miracle solutions for solving this problem except ensuring you don't use up too much memory! I just wanted to write this post to help people who are encountering intermittent "Parameter is not valid" exceptions figure out what is going on!

Shameless plug: LavaBlast creates industry-specific interactive kiosks integrated with tailor-made point of sale software, and a variety of other software solutions for franchisors.

An Improved SubSonic ManyManyList

clock November 28, 2007 13:38 by author JKealey

Etienne is on fire with his recent blog posts about SubSonic, so I thought I would contribute one too.

Five months ago I submitted a patch to SubSonic concerning their ManyManyList control (SubSonic.Controls.ManyManyList). I love the control as it is a real time saver, but there are a few limitations.

1 - Cannot use a view as a primary table or foreign table.
In my context, I want to display strings and these strings are not directly in the ForeignTable. The control had assumptions on the presence of a primary key.

2 - Cannot sort the resulting elements
In my context, I want to sort the strings alphabetically.

3 - Cannot filter the foreign table
In my context, a particular item can be in multiple categories, but the set of categories it can be in is not the full foreign table.

4 - The save mechanism deletes all rows and recreates them. If you have other columns in your map table, you lose all that information. Furthermore, there are no checks to see if the delete/recreation is necessary. Even if there are no changes, it still deletes/recreates everything.

I've pretty much re-written everything to support the listed behaviour. The parameter names should be reviewed because they are not very user friendly, and I am not well versed in the SubSonic naming conventions. Since then, we've used this code in production and it appears to work perfectly for our purposes (and it should work exactly as the other one did out of the box if you don't specify any of the new properties).

Agrinei enhanced my code to make it even more customizable.

Download the patch directly on CodePlex and don't forget to vote for the issue!

SubSonic object change tracking

clock November 28, 2007 08:44 by author EtienneT

Sometimes you want to know what your users are doing in your system. You probably don't want to spend too much time on this feature, but it is nice to be able to prove that a certain user introduced data inconsistencies and it isn't your fault. Obviously, the "client is always right" but tracking down the source of a problem (and pointing fingers) is something that is very useful to developers like us!

We decided to add events in our controllers to catch when an object is updated or inserted. When that occurs, we can produce simple HTML about the object through reflection. If the user changed an object, we can generate a nice view of what was changed, as well. Here is an example of a save to the database. We generate a string and insert it into our ElectronicJournal (audit log) to keep track of all the actions in the system:

Type: InventoryItem
Schema: SubSonic.TableSchema+Table
ItemGUID: d7fc5f85-129a-4909-b4ac-490fc26d9007
StoreID: TestStore1
StorePrice: [12.99] -> [7.99]
Tax1: True
Tax2: True
Tax3: True
Tax4: True
CreditPoints: [129] -> [79]
SalePoints: [1299] -> [799]
BonusPoints: 0
StockQuantity: 0
IsReportable: True
PromptForPrice: False
HideItemInStore: False
Item: 3.2
Store: Test Store 1
TableName: InventoryItems
NullExceptionMessage: {0} requires a value
InvalidTypeExceptionMessage: {0} is not a valid {1}
LengthExceptionMessage: {0} exceeds the maximum length of {1}

By tracking various actions in your database, you can make a quick listing to track a user's actions. If you add a few more columns in your audit log table, you can track who did the change, when they did it, their IP, etc. We keep track of a set of action types which allows us to filter by "Saved Objects" on a particular date, for a certain person (for example). This allows us to visualize what a user did in an efficient manner. Obviously this is a pretty simple example that is based on the existence of SubSonic objects and we don't handle complicated scenarios for the time being. If you run update queries on the database that change multiple objects in one save, we obviously never generated any SubSonic objects and cannot track those changes. However, most of the time, it is sufficient to have a good idea of what is going on. Again, we reap the benefits of our standard use of custom SubSonic controllers.

I have included a pretty simple utility class that you can use to do this. It's pretty generic, so you can easily use this with your own classes. The methods of interest in the class are:

public static string DumpObjectToHtml<T>(T before, T after)


public static string DumpObjectToHtml(object obj)
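A reflection-based implementation of the diff variant might look like the following sketch (simplified relative to the attached SubSonicHelper.cs, whose actual code may differ):

```csharp
using System.Reflection;
using System.Text;

// Hypothetical sketch: one line per property, marking changes as [old] -> [new],
// matching the ElectronicJournal dump shown above.
public static string DumpObjectToHtml<T>(T before, T after)
{
    StringBuilder sb = new StringBuilder();
    sb.AppendFormat("Type: {0}<br/>", typeof(T).Name);

    foreach (PropertyInfo p in typeof(T).GetProperties())
    {
        if (!p.CanRead || p.GetIndexParameters().Length > 0)
            continue; // skip write-only properties and indexers

        object oldValue = p.GetValue(before, null);
        object newValue = p.GetValue(after, null);

        if (!object.Equals(oldValue, newValue))
            sb.AppendFormat("{0}: [{1}] -> [{2}]<br/>", p.Name, oldValue, newValue);
        else
            sb.AppendFormat("{0}: {1}<br/>", p.Name, newValue);
    }
    return sb.ToString();
}
```

The single-argument overload is the degenerate case: dump every property without the before/after comparison.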


The class includes some other useful functions like:


public static C Paging<T, C>(C result, int startRowIndex, int maximumRows)
where T : ActiveRecord<T>, new()
where C : ActiveList<T, C>, new()


public static void Sort<T, C>(string sort, C result)
where T : ActiveRecord<T>, new()
where C : ActiveList<T, C>, new()


Those two methods enable you to page or sort a SubSonic collection if you can't use SQL Paging/Sorting. I had to use this to Page/Sort a SubSonic collection containing both database objects and my objects that I created after the database call. I hope this can be useful to someone.

SubSonicHelper.cs (3.57 kb)


SubSonic magic

clock November 27, 2007 16:13 by author EtienneT

ASP.NET developers who don't know what SubSonic is should definitely go read about it. We have been using SubSonic since the initial 2.0 release and have built most of our recent data-driven applications using this wonderful tool. We really like SubSonic and think it has helped us tremendously in our daily productivity. For all simple SQL operations, we try to never write custom SQL and use the SubSonic Query object instead. What I want to talk about today is how we use SubSonic in our main project FranchiseBlast and how we think this could be useful for other programmers like us.

A bit of introduction is required; FranchiseBlast has a set of different web pages to manage our data (products, sale reports, pricing, web orders, etc.). Most of these pages use the typical master list / detailed view usage scenario, with the master list being filtered due to search criteria. Additionally, because FranchiseBlast includes fine-grained access control, all lists need to be filtered according to these permissions, which vary per user. Although there are many built-in constructs in ASP.NET (GridView, DetailsView, FormView, etc.) that support this scenario via an SqlDataSource, let us explain why we prefer using SubSonic.

SubSonic Controllers

Most of the time, you don't want to rewrite the same code again and again. When doing a classic scenario of displaying a GridView in ASP.NET, you are faced with multiple things you have to handle. First, how do you do sorting? Do you do it in memory with a DataView, for example, or do you let SQL server do it? Writing the SQL to benefit from SQL Server sorting is more work, but if your table has a lot of data, then it becomes absolutely necessary.

What do you do for paging? Do you want to fetch ALL the rows from your database and then let the grid view handle the paging in memory? This works well until you have a table containing thousands of rows, because your GridView only displays the first 15 rows and you're transferring all that data from SQL Server to ASP.NET for nothing (that is, if you didn't use caching, but that's another subject).

Sorting and paging are only two problems. What about filtering with a search string? Filtering by user permissions? These are problems we wanted to solve once and for all at LavaBlast when creating our core data management engine. We did not want to copy-paste complex SQL code everywhere to manage filtering, sorting, and paging, as this would cause a maintainability nightmare. We don't have millions of rows in our tables, but we have enough to cause performance issues if we don't think about scalability.

So what is the miracle solution? Here at LavaBlast, we think the solution is custom SubSonic controllers used with ASP.NET ObjectDataSource. We assume our readers already know how to use an ObjectDataSource and have a reasonable knowledge of SubSonic. So if you are not really familiar with ObjectDataSource, you should probably read this tutorial first: Displaying Data With the ObjectDataSource. And if you are not familiar with SubSonic, go to the web site and watch some SonicCasts or go read Rob Conery's blog (very interesting blog). Rob is one of the authors of SubSonic who just got hired by Microsoft to continue to work on the goodness of SubSonic! Nice work Rob!

Some code

Here is some sample source code that makes use of our custom Controller class. This class was manually authored and it adds a few methods to the Controller SubSonic generated for us by using the partial class mechanism available in C#. It returns a sorted list of item groups (a set of items) filtered by the user's permissions. Furthermore, it returns only the item groups to be displayed in the UI. We've extracted out the generic concepts of filtering, sorting, and paging. In the end, SubSonic generates a single SQL query (using temporary tables) that returns only the rows displayed in the UI.

This controller is bound to an ObjectDataSource which feeds records into the GridView. FetchByProductAuthority has parameters to handle sorting, paging, and data filtering. "startRowIndex" and "maximumRows" are parameters used for paging. The "sort" parameter specifies the sort column name and the other three parameters are for filtering our data.
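To give the flavor of such a method (the table, column, and parameter names are invented here, and the real FranchiseBlast code is more involved), a Fetch method consumable by an ObjectDataSource could be shaped like:

```csharp
// Hypothetical sketch of an ObjectDataSource-friendly fetch method
// on the partial ItemGroupController.
public static ItemGroupCollection FetchByProductAuthority(
    string search, int productAuthorityId,
    string sort, int startRowIndex, int maximumRows)
{
    // Sorting and paging are folded into the query by the base controller.
    Query q = GetQueryByParams(sort, startRowIndex, maximumRows);

    // Permission filtering is pushed down to SQL Server
    // (assumed column name for illustration).
    q.AddWhere(ItemGroup.Columns.ProductAuthorityID, productAuthorityId);

    ItemGroupCollection coll = new ItemGroupCollection();
    coll.LoadAndCloseReader(q.ExecuteReader());
    return coll;
}
```

The ObjectDataSource supplies sort, startRowIndex, and maximumRows automatically, so the page's markup stays free of any paging or sorting logic.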

SubSonic connects to our SQL Server Database and uses code templates to generate ActiveRecord objects for all our tables, views, and stored procedures. Using strongly typed objects throughout our applications instead of ADO.NET DataRows greatly improves its maintainability for a negligible performance hit, since we only need a couple dozen objects to render one page of a GridView.

One of our ideas was to change the code generation templates to make all controllers derive from our base controller class. We made this class generic so that it could be used for all kinds of SubSonic objects and makes it easier to deal with object collections, also generated by SubSonic.

public abstract class SubSonicController<T, C>
where T : AbstractRecord<T>, new()
where C : AbstractList<T, C>, new()

We modified the code generation templates so that SubSonic controllers would extend our base controller:

public partial class ItemGroupController : SubSonicController<ItemGroup, ItemGroupCollection>

Since the class is partial, you can make a new file and continue to write methods for this controller in a completely separate file which won't be overwritten by SubSonic when you regenerate objects from the relational database. We could also have inherited from the generated controller to perform our extensions.

In any case, all our controllers have the same base class. This enables us to add functions to all our controllers. The most basic tasks you want to do with a controller is to do SQL sorting of your data and SQL paging. SubSonic can easily do the sorting and paging for you with a SubSonic Query object. So we decided to write custom functions to adapt ObjectDataSource parameters to work with the SubSonic Query object. I include some code here to show how we do this:

protected static Query GetQueryByParams(string sort, int startRowIndex, int maximumRows)
{
    Query q = CreateQuery();
    q = AddPaging(q, startRowIndex, maximumRows);
    q = AddSort(q, sort);
    return q;
}

protected static Query AddSort(Query q, string sort)
{
    if (!string.IsNullOrEmpty(sort))
    {
        if (sort.Contains("DESC"))
            q.OrderBy = OrderBy.Desc(sort.Split(' ')[0]);
        else
            q.OrderBy = OrderBy.Asc(sort.Split(' ')[0]);
    }
    return q;
}

protected static Query AddPaging(Query q, int startRowIndex, int maximumRows)
{
    if (maximumRows > 0)
    {
        q.PageIndex = (int)(startRowIndex / maximumRows) + 1;
        q.PageSize = maximumRows;
    }
    return q;
}

Therefore, when we want to construct a basic query which handles sorting and paging for us, all we have to do is to write this code: Query q = GetQueryByParams(sort, startRowIndex, maximumRows);. In addition, for those who are not yet familiar with the ObjectDataSource, it automagically provides the sort, startRowIndex, and maximumRows parameters according to what is clicked in the GridView. In summary, we don't have much code to write to obtain data from our database efficiently, in a way that respects our business processes (access control, auditing, etc.).

Furthermore, we added a custom method to handle searching through a list of columns: see the SearchFields property above. We simply define a List<string> of column names in which we wish to search and our base controller handles adding the required filters to the SubSonic Query object. Finally, we also added default sorting, which allows us to simply set the DefaultSort property in our controller to be used when no sort column is specified in the Fetch methods.
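The search filter can be sketched in the same style as AddSort and AddPaging. This is a hypothetical reconstruction (our real helper differs), and the comments flag where SubSonic 2.0's limited OR support bites:

```csharp
using System.Collections.Generic;

// Hypothetical sketch; the exact OR construct in SubSonic 2.0 is limited,
// so combining several search fields with additional AND filters can
// generate unreliable SQL (see the disclaimer at the end of this post).
protected static Query AddSearch(Query q, string search, List<string> searchFields)
{
    if (string.IsNullOrEmpty(search))
        return q;

    foreach (string field in searchFields)
    {
        // One LIKE filter per searchable column; with more than one field,
        // these need OR semantics, which is where SubSonic 2.0 falls short.
        q.AddWhere(field, Comparison.Like, "%" + search + "%");
    }
    return q;
}
```

DefaultSort is simpler: when the Fetch method receives an empty sort parameter, the base controller substitutes the controller's DefaultSort property before calling AddSort.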

I'll stop here; I think this article is already WAY too long. I hope this can be useful to someone! We're currently considering upgrading the controller templates to automatically include methods like FetchByProductAuthority which are commonly used in FranchiseBlast, depending on the SQL Schema of the table. We could even generate the sort/search fields by using metadata stored in the database. Furthermore, we're very interested in generating some ASCX/ASPX files, following the architecture imposed by our solution, which would make use of our controllers. These generated files would be good starting points to cut down on development time for some of our pages.

I include the SubSonicController class if anyone would be interested.

SubSonicController.cs (4.22 kb)

As for the code template for SubSonic, just modify it to use this base class and pass the right types in the generic parameters.

    public partial class <%=tbl.ClassName %>Controller : SubSonicController<<%=tbl.ClassName%>, <%=tbl.ClassName%>Collection>

If you have any questions or comments, don't hesitate to send them in.

Edit: It has come to my attention that I forgot to provide a disclaimer concerning the SearchFields in this post. SubSonic 2.0 does not fully support the OR query construct. Keep in mind that your query will not work properly if you specify more than one field name in the SearchFields list AND you also use the AND query construct. (SubSonic relies on boolean operator precedence, not parentheses.)





The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.

© Copyright 2017
