LavaBlast Software Blog

Help your franchise business get to the next level.

Gotcha: String or binary data would be truncated.

clock March 18, 2014 11:42 by author jkealey

I was playing in SQL Server this morning, trying to fix an odd bug. Took me a while to find it and I thought I’d share this tidbit with you.

Here’s an overly simplistic representation of what was causing the “String or binary data would be truncated.” error message:

declare @where nvarchar(max);
-- assume something put a large string in @where (over 8000 characters).
declare @sql nvarchar(max);
select @sql = replace('select * from table1 where {0} order by column1 asc', '{0}', @where);
exec (@sql);


The reason I experienced this error is how REPLACE handles inputs that are not nvarchar(max). Explicitly casting the first parameter to nvarchar(max) resolves the error.

declare @where nvarchar(max);
-- assume something put a large string in @where (over 8000 characters).
declare @sql nvarchar(max);
select @sql = replace(cast('select * from table1 where {0} order by column1 asc' as nvarchar(max)), '{0}', @where);
exec (@sql);


From the documentation:

If string_expression is not of type varchar(max) or nvarchar(max), REPLACE truncates the return value at 8,000 bytes. To return values greater than 8,000 bytes, string_expression must be explicitly cast to a large-value data type.
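To see the behavior concretely, here is a hypothetical repro sketch (table and variable names are illustrative; note that REPLICATE must also be fed an nvarchar(max) input to produce more than 4,000 characters):

```sql
declare @where nvarchar(max) = replicate(cast(N'x' as nvarchar(max)), 10000);
declare @sql nvarchar(max);

-- The literal below is a plain nvarchar, so REPLACE truncates its result at 8,000 bytes:
select @sql = replace('select * from t where {0}', '{0}', @where);
select len(@sql) as truncated_length;

-- Casting the first argument to nvarchar(max) preserves the full result:
select @sql = replace(cast('select * from t where {0}' as nvarchar(max)), '{0}', @where);
select len(@sql) as full_length;
```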

ASP.NET translation tools & gotchas

clock March 8, 2013 11:35 by author JKealey

We’ve recently translated one of our applications and thought we’d share the tools & techniques we used. Every time we perform some ASP.NET translation, we hit the same few gotchas, so we figured we might as well write everything down in a post for everyone’s benefit.


Because we’re translating an ASP.NET WebForms application, the main process is to open an *.aspx or *.ascx, switch to Design view, and perform Tools –> Generate Local Resource. This generates a *.resx file and adds the relevant markup in your source file. Tools are available to perform the actual translation and create the *.resx files in other languages.

The core issue here is that you need to perform this operation for each individual file. Potentially thousands of times and/or until you go crazy. (Personally, I find it frustrating that bulk resource generation is not a core VS.NET feature.)


Step 1 - Bulk Generate Local Resource

Instead of wasting our time opening each individual file, we found a macro on this forum. The macro does not run in VS.NET 2012, so we loaded up our old VS.NET 2010 and ran it from there.

The macro wasn’t flawless – it sometimes randomly crashed after processing files for half an hour. Deleting Visual Studio’s *.suo file and restarting it seemed to help.


Step 2 - Realize that VS.NET corrupted your files

I assume one of the reasons bulk resource generation is not a core VS.NET feature is because the feature is (in addition to being slow) partially broken.

Gotcha: Inline scripts/comments are sometimes deleted.

At a high level, any script blocks in your *.aspx/*.ascx files are vulnerable to deletion. Generate Local Resource will simply strip them out if they are contained in an <asp:UpdatePanel …/>.   We filed a bug on Microsoft Connect which was not deemed important enough to be fixed.

This is appalling because it introduces pernicious bugs in your application that only show up at runtime, unless you pay close attention to each and every individual file.

For example:

<script>alert('<%="Some Constant" %>');</script>
<script>alert('<%= btnSomething.ClientID %>');</script>
<%-- <asp:Button runat="server" id="btn" Text="Some button that I may need to re-enable later"/> --%>
<%
    if (Request.QueryString["test"]=="bye")
        Response.Write("Goodbye World");
    Response.Write("Hello World");
%>

After Generate Local Resource, all of the above is simply gone: the script blocks and the comment are silently deleted.
Admittedly, some of the inline code above is bad practice.  However, the silent deletion causes needle-in-a-haystack type bugs at runtime.

We decided to remove all inline code blocks from our pages to avoid issues during local resource generation.

Gotcha: culture="auto" and uiculture="auto" are added to all *.aspx files

These values, added in the *.aspx header, force the page to change culture based on the browser’s settings. In our application, this was not desirable as it bypassed the logic defined in our Global.asax file. (Our users can change their language via the web application itself, not via their web browser settings.)

For more information, see this post by Rick Strahl.

Gotcha: Nested controls can be problematic

When trying to localize a LinkButton containing an Image and literal, the Image will be dropped.

<asp:LinkButton ID="lnkHello" runat="server" OnClick="lnkHello_Click">
    <asp:Image ID="imgEdit" runat="server" ImageUrl="~/images/icons/edit.gif"></asp:Image>
</asp:LinkButton>

becomes the following after Generate Local Resource (the nested Image is silently dropped):

<asp:LinkButton ID="lnkHello" runat="server" OnClick="lnkHello_Click" meta:resourcekey="abcdef">
</asp:LinkButton>


To solve this issue, the nested controls must be separated.
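For instance, one way to sketch the workaround (hypothetical markup, not the exact fix from our application) is to split the LinkButton in two, so the localizable text no longer shares a control with the Image:

```
<%-- Icon-only LinkButton: nothing for Generate Local Resource to localize --%>
<asp:LinkButton ID="lnkHelloIcon" runat="server" OnClick="lnkHello_Click">
    <asp:Image ID="imgEdit" runat="server" ImageUrl="~/images/icons/edit.gif" />
</asp:LinkButton>
<%-- Text-only LinkButton: safely receives its meta:resourcekey --%>
<asp:LinkButton ID="lnkHello" runat="server" OnClick="lnkHello_Click" Text="Hello" />
```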

Gotcha: Ajax:Accordion breaks during Generate Local Resource

If you are using <ajax:Accordion ../> from the ASP.NET Ajax control toolkit, be aware that it will be corrupted after generating *.resx files. The fix is simple: delete the erroneously added Accordion Extender.


Step 3 – Extract other hardcoded strings.

Your *.aspx/*.ascx files and your *.cs files may contain additional strings which must be extracted. Back in 2008, we had created a macro to help with this process, but in this iteration we simply used JetBrains ReSharper. The VS.NET plugin made it easy to find strings which had not been extracted, and push them into *.resx files. ReSharper is jam-packed with other useful features, but we’ve found that it does have a significant impact on performance in our solution.


Step 4 – Perform the actual translation

Back in 2008, we released a web application to help translate RESX files. We’re no longer using this application – there are better options out there. We picked Zeta Resource Editor and it worked nicely.


The tools available today are much better than they were five years ago, but one piece of the puzzle (Generate Local Resource) is still far from perfect. We’d love to see an improved version (in either VS.NET or ReSharper) which would:

  • Not delete inline code or comments inside UpdatePanels
  • Be configurable (insert culture="auto" everywhere? etc.)
  • Produce reviewable reports of any changes that are not simple additions of meta:resourcekey to controls. (Performing a diff with regular tools is very time consuming given the thousands of changes.)
  • Run in batch in a timely manner across a project

PS: Big thanks to @plgelinas for his research efforts for this project.

jQuery plugin to postback an ASP.NET button

clock August 20, 2012 10:52 by author EtienneT

We use jQuery a lot here at LavaBlast, but we also use ASP.NET WebForms. We needed a simple reusable way to cause a postback on a managed Button or LinkButton.

Here is how it would be used for <asp:Button ID="btShow" runat="server" OnClick="DoSomething" />

$('[id$="btShow"]').postback(); // Cause btShow to postback to the server

If you are not too familiar with jQuery, the selector [id$="btShow"] searches for any control with an id that ends with "btShow".
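To see why an ends-with selector is needed, recall that WebForms prefixes server control IDs with their naming containers (the rendered id below is hypothetical):

```javascript
// A server control declared as ID="btShow" may render with a mangled client id:
var clientId = "ctl00_MainContent_btShow";

// [id$="btShow"] matches because the rendered id ends with the server-side ID:
console.log(clientId.endsWith("btShow")); // true
```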

Since ASP.NET 4.0, you can also set the new ClientIDMode="Static" property on the server control to get a static ID on the client, and then use a jQuery selector like $('#btShow'), but that is a discussion for another day.

The postback() method is a jQuery plugin which I include here:

(function ($) {
    $.fn.postback = function () {
        return this.each(function () {
            // asp:Button renders an <input type="submit">: triggering its click submits the form.
            if (this && "undefined" != typeof this.click && this.tagName.toLowerCase() != "a")
                this.click();
            // asp:LinkButton renders <a href="javascript:__doPostBack(...)">: run the href.
            else if (this && this.tagName.toLowerCase() == "a" && this.href.indexOf('javascript:') == 0)
                eval(this.href.toString().replace('javascript:', ''));
        });
    };
})(jQuery);

Feel free to use this and let us know if you find any problems with the code.

Style ASP.NET Web Forms Validators with qTip 2

clock August 13, 2012 08:20 by author EtienneT

View demo | Download source

The default validators inside ASP.NET Web Forms are quite uninteresting and require some styling work to look adequate.  Recently, we’ve been using the qTip2 jQuery library and we love it.  qTip enables you to add visually pleasant tooltips to any element.  In the simplest use case, you add a “title” attribute to any element, apply qTip to it, and the “title” attribute is used as the tooltip’s text.  Here’s an example with our FranchiseBlast registration form.


When you try to submit this form and validation fails, styled qTip tooltips appear beside each validated element, replacing the default ASP.NET validators.


As you can see, the validators have absolute positioning, which enables them to flow outside of the bounds of the registration panel.  We could also easily change the position of the bubble in relation to the validated element, as well as the position of the bubble tip.

Let’s take a look at what was needed to accomplish this, using a simple ASP.NET project. Here is the main ASP.NET code for the ASPX page.  Nothing fancy: a simple form with some validators:


<asp:ScriptManager ID="p" runat="server">
    <Scripts>
        <asp:ScriptReference Path="" />
        <asp:ScriptReference Path="~/Scripts/qtip/jquery.qtip.min.js" />
        <asp:ScriptReference Path="~/Scripts/validators.js" />
    </Scripts>
</asp:ScriptManager>
<fieldset class="Validate" style="width: 300px">
    <legend>Tell us about yourself</legend>
    <div>
        <span class="label">Business Name:</span>
        <asp:TextBox ID="txtBusinessName" runat="server" />
        <asp:RequiredFieldValidator ID="rfvBusinessName" runat="server" ControlToValidate="txtBusinessName" Text="Your business name is required" SetFocusOnError="true" EnableClientScript="true" />
    </div>
    <div class="alternate">
        <span class="label">Your Name:</span>
        <asp:TextBox ID="txtYourName" runat="server" />
        <asp:RequiredFieldValidator ID="rfvName" runat="server" ControlToValidate="txtYourName" Text="Your name is required" SetFocusOnError="true" EnableClientScript="true" />
    </div>
    <div>
        <span class="label">Your Email:</span>
        <asp:TextBox runat="server" ID="txtEmail" />
        <asp:RequiredFieldValidator ID="rfvEmail" runat="server" ControlToValidate="txtEmail" Text="Email is required" SetFocusOnError="true" EnableClientScript="true" />
        <asp:RegularExpressionValidator runat="server" ID="revEmail" Text="Invalid Email" ControlToValidate="txtEmail" SetFocusOnError="true" ValidationExpression="^([0-9a-zA-Z]([-.\w]*[0-9a-zA-Z])*@(([0-9a-zA-Z])+([-\w]*[0-9a-zA-Z])*\.)+[a-zA-Z]{2,9})$" EnableClientScript="true" />
    </div>
    <asp:Button runat="server" ID="btnCreateAccount" CssClass="Next" Text="Create Account" />
</fieldset>

Starting from the top, we need jQuery and qTip to be added to our page.  The interesting JavaScript code is located in ~/Scripts/validators.js.  The rest of the code here is a simple ASP.NET form.  One important thing is that each element to be validated is enclosed in a <div> with its corresponding validators.  This is important because we will use this convention later in our script to find the associated validators for an input control.

I also have to mention that I added some lines in the .skin file of the App_Theme:

<asp:RequiredFieldValidator runat="server" CssClass="ErrorMsg" Display="Dynamic" />
<asp:CustomValidator runat="server" CssClass="ErrorMsg" Display="Dynamic" />
<asp:RangeValidator runat="server" CssClass="ErrorMsg" Display="Dynamic" />
<asp:CompareValidator runat="server" CssClass="ErrorMsg" Display="Dynamic" />
<asp:RegularExpressionValidator runat="server" CssClass="ErrorMsg" Display="Dynamic" />

This will force CssClass="ErrorMsg" on validators.  This will be used next in our JavaScript code to find the validators:


Sys.WebForms.PageRequestManager.getInstance().add_pageLoaded(function () {
    // Finds the visible ASP.NET validators associated with an input; relies on the
    // convention that the input and its validators share the same parent <div>.
    function getValidator($this) {
        return $this.parent().find('.ErrorMsg').filter(function () { return $(this).css('display') != 'none'; });
    }

    var inputs = '.Validate input:text, .Validate select, .Validate input:password';
    var submit = $('input:submit');

    var q = $(inputs).qtip({
        position: {
            my: 'center left',
            at: 'center right'
        },
        content: {
            text: function (api) {
                return getValidator($(this)).html();
            }
        },
        show: {
            ready: true,
            event: 'none'
        },
        hide: {
            event: 'none'
        },
        style: {
            classes: 'ui-tooltip-red ui-tooltip-shadow ui-tooltip-rounded'
        },
        events: {
            show: function (event, api) {
                // Don't show the tooltip when no validator is visible.
                var $this = $(api.elements.target);
                var validator = getValidator($this);
                if (validator.length == 0)
                    event.preventDefault();
            }
        }
    });

    if (window.Page_ClientValidate != undefined) {
        function afterValidate() {
            $(inputs).each(function () {
                var validator = getValidator($(this));
                if (validator.length > 0) {
                    var text = validator.html();
                    $(this).addClass('Error').qtip('show').qtip('option', 'content.text', text);
//                    validator.hide();
                }
            });
        }

        // Wrap the original Page_ClientValidate so afterValidate runs after each validation pass.
        var oldValidate = Page_ClientValidate;
        Page_ClientValidate = function (group) {
            var result = oldValidate(group);
            afterValidate();
            return result;
        };
    }
});

There is much to explain in this code.  First, we register a new function to be executed each time there’s an ASP.NET postback on the page, via Sys.WebForms.PageRequestManager.getInstance().add_pageLoaded(function () { … });

The function getValidator finds the visible ASP.NET validators associated with a control to be validated.  We use the fact that the control to validate and its validators are contained inside the same <div>.

We apply qTip to the inputs to validate and obtain the tooltip’s text by finding the visible validators.  We also have some logic to prevent showing the qTip element if there aren’t any visible validators.

We also do some monkey patching at the end, where we inject our own code inside ASP.NET’s Page_ClientValidate JavaScript method.  To do that, we simply keep a reference to the original Page_ClientValidate function, create a new function with our additional code (calling the old Page_ClientValidate), and override window.Page_ClientValidate with our new function.  The new function has both the new and old functionality.
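In isolation, the monkey patching pattern looks like this (the function body is an illustrative stand-in for ASP.NET’s real Page_ClientValidate, not its actual logic):

```javascript
// Illustrative stand-in for ASP.NET's built-in client validation routine.
function Page_ClientValidate(group) {
    return group !== ""; // pretend: valid unless the validation group is empty
}

// Monkey patch: keep a reference to the original, then override it with a
// wrapper that runs our extra code after the original validation.
var oldValidate = Page_ClientValidate;
Page_ClientValidate = function (group) {
    var valid = oldValidate(group); // original behavior first
    // ... additional code here, e.g. show/hide the qTip tooltips ...
    return valid;                   // preserve the original return value
};

console.log(Page_ClientValidate("grp")); // true
console.log(Page_ClientValidate(""));    // false
```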

You would probably have to modify this code a little bit to fit your needs, but this shows how you could integrate qTip2 for nicer validators in ASP.NET Web Forms.

View demo | Download source

Microsoft Excel on Multi-Monitor Machines

clock June 5, 2012 11:51 by author jkealey

All of the developers at LavaBlast use three monitors; utilizing multiple monitors has significantly increased our efficiency. However, Microsoft Excel doesn’t work particularly well in a multi-monitor setup. By default, every time you open a new Excel file, its contents are displayed within the same instance. You have to manually launch other instances of Excel to have one instance per monitor, which is time consuming.

It is possible to configure Microsoft Excel to load one window per file, but it involves a number of obscure configuration settings & registry changes. Every time we move to a new machine, this configuration needs to be redone. The information is spread out over a number of sites/forums and it takes a while to re-discover the sources. This post aims at centralizing this information.

In particular, this post focuses on Microsoft Excel 2010 on Windows 7 64-bit. I believe the fix works on other versions as well; feel free to comment on this blog post if the steps are different.

Step 1) Force Excel To Open Multiple Windows

Excel 2010:

  • File –> Options –> Advanced –> Scroll down to the “General” section –> Check the “Ignore other applications that use Dynamic Data Exchange (DDE)” checkbox

Excel 2007:

  • Office Icon in the top left corner of Excel –> Excel Options –> Advanced –> Scroll down to the “General” section –> Check the “Ignore other applications that use Dynamic Data Exchange (DDE)” checkbox

Once this change is done, every time you double click on an Excel file in Windows Explorer, a new instance of Excel will open. However, you’ll probably encounter the following error.

Step 2) Fixing “There was a problem sending the command to the program”

Each Excel file you open from Windows Explorer now launches in its own separate window. However, Excel spits out “There was a problem sending the command to the program” and leaves the Excel window blank.  You can drag & drop your existing file onto this window to open it, but this is still painful. We will need to change the system registry to solve this issue; please refrain from doing this if you are not comfortable with the Registry Editor (regedit).

  1. Launch regedit
  2. Rename the HKEY_CLASSES_ROOT\Excel.Sheet.8\shell\Open\ddeexec key to HKEY_CLASSES_ROOT\Excel.Sheet.8\shell\Open\ddeexec.bak
  3. Edit HKEY_CLASSES_ROOT\Excel.Sheet.8\shell\Open\command\(Default). Change /dde to "%1" in the value.
  4. As an example, mine went from "C:\Program Files (x86)\Microsoft Office\Office14\EXCEL.EXE" /dde to "C:\Program Files (x86)\Microsoft Office\Office14\EXCEL.EXE" "%1"
  5. Edit HKEY_CLASSES_ROOT\Excel.Sheet.8\shell\Open\command\command. Change /dde to "%1" in the value.
  6. As an example, mine went from ykG^V5!!!!!!!!!MKKSkEXCELFiles>VijqBof(Y8'w!FId1gLQ /dde to ykG^V5!!!!!!!!!MKKSkEXCELFiles>VijqBof(Y8'w!FId1gLQ "%1"
  7. Rename the HKEY_CLASSES_ROOT\Excel.Sheet.12\shell\Open\ddeexec key to HKEY_CLASSES_ROOT\Excel.Sheet.12\shell\Open\ddeexec.bak
  8. Edit HKEY_CLASSES_ROOT\Excel.Sheet.12\shell\Open\command\(Default). Change /dde to "%1" in the value.
  9. Edit HKEY_CLASSES_ROOT\Excel.Sheet.12\shell\Open\command\command. Change /dde to "%1" in the value.


Excel should now load a separate window for each file you open. This setup will consume more memory, but will vastly increase your productivity.

Troubleshooting note:

  • Ensure you used "%1" with the surrounding quotes (not this: %1) in the above registry changes. Otherwise, you will get an error message: “’{file}’ could not be found. Check the spelling of the file name, and verify that the file location is correct.”

Thanks to Turbo2001rt  for the final important tweaks.

      LavaBlast POS v4.0.0

      clock September 6, 2011 13:49 by author jkealey

      We’re just about to release version 4.0.0 of our franchise point of sale system. One of the most noteworthy changes is that we’ve given the look & feel a major overhaul, thanks to jQuery Mobile, which we’ve blogged about previously. We thought we’d take a minute to share with you what makes it so special!

      First off, I’ve recorded a short video featuring a variation of our franchise POS for the Teddy Mountain franchise. Teddy Mountain provides the stuff-your-own-teddy-bear experience to children worldwide and has been using our POS since 2006.


      As you’ll see, I focus on a few of our differentiators in the point of sale space. We’re not a point of sale company and our POS is not conventional: we’re a franchise software company and we’ve created the best point of sale system for a franchise environment.

      We bake a franchise’s unique business processes into the point of sale, making it very powerful while remaining extremely easy to use. By integrating our point of sale with FranchiseBlast, we’ve also eliminated dozens of standardization/uniformity issues that face small retail chains or franchises.

      Furthermore, we’ve given additional focus to cross-browser compatibility in this release, as our POS is not only used on regular POS hardware (in brick & mortar stores) but also on the Apple iPad for back office operations and for managing the warehouses that feed our franchise e-commerce websites.  We’re definitely excited by the potential tablets have for both retail and service-based franchises! Expect more news from us in this space soon!

      In the meantime, if you know of small chains / new franchises which want to explore disruptive technologies in their locations, we hope you’ll point them in our direction!

      Gotcha: Reporting Services Viewer bugs on Google Chrome

      clock June 28, 2011 11:09 by author jkealey

      We include the ASP.NET ReportViewer which comes with Microsoft SQL Reporting Services inside some of our applications. Simply put, it generates a web-based version of the report and can easily be integrated within a website. However, the ReportViewer has been plagued with numerous cross-browser compatibility bugs over the years. Some have been fixed, while others remain. Recently, we’ve had the following issues:

      • On Google Chrome, each button in the toolbar takes a separate line. You thus end up with 5 toolbars instead of one, taking up all the space.
      • On Google Chrome, the width & height were slightly off (50 to 100 pixels), causing scrollbars to appear.

      A quick search revealed some sample code for similar issues, but none of it fully resolved our problems. Mainly, we require AsyncRendering="true" and most of the fixes didn’t work in this context. Here’s what we ended up rolling with (uses jQuery and Microsoft AJAX).
      // container is either the ReportViewer control itself, or a div containing it.
      function fixReportingServices(container) {
          if ($.browser.safari) { // toolbars appeared on separate lines (old jQuery reports Chrome as Safari).
              $('#' + container + ' table').each(function (i, item) {
                  if ($(item).attr('id') && $(item).attr('id').match(/fixedTable$/) != null)
                      $(item).css('display', 'table');
                  else
                      $(item).css('display', 'inline-block');
              });
          }
      }
      // needed when AsyncRendering=true.
      Sys.WebForms.PageRequestManager.getInstance().add_pageLoaded(function () { fixReportingServices('rpt-container'); });
      /*$(document).ready(function () { fixReportingServices('rpt-container');});*/
      Example .aspx:

      <div style="background-color: White; width: 950px" id="rpt-container">
          <rsweb:ReportViewer ID="ReportViewer1" runat="server" Font-Names="Times New Roman"
              Font-Size="8pt" Height="700px" Width="950px" ShowExportControls="true" ShowPrintButton="false"
              ShowRefreshButton="false" ShowZoomControl="false" SkinID="" AsyncRendering="true"
              ShowBackButton="false">
              <LocalReport ReportPath="contract.rdlc"
                  DisplayName="Contract">
              </LocalReport>
          </rsweb:ReportViewer>
      </div>
      <asp:ScriptManagerProxy ID="proxy" runat="server">
          <Scripts>
              <asp:ScriptReference Path="~/js/fixReportViewer.js" />
          </Scripts>
      </asp:ScriptManagerProxy>

      The fixes we found on other websites (setting the display to inline-block on the included tables) only worked for the first load – as soon as the report changed due to AsyncRendering="true", the toolbars were broken again. This was fixed by replacing jQuery’s ready function with Microsoft ASP.NET Ajax’s pageLoaded event.

      We also noticed that these fixes broke our width & height. We pinpointed the issue to the generated HTML table with an id ending in fixedTable, which needed to be left as display: table instead of inline-block. We thus adapted the JavaScript.

      The HTML wraps the ReportViewer with a div, mostly for convenience (to avoid peppering our code with <%= ReportViewer1.ClientID %>). Furthermore, if my memory serves me well, we set the background-color manually because some browsers made the ReportViewer transparent.

      Hope this helps! If you find more elegant ways of doing this, or know of more gotchas, please let us know!

      Using Microsoft POS for .NET in 2011

      clock June 6, 2011 08:41 by author jkealey

      Five years ago, we decided to utilize Microsoft’s Point Of Service for .NET (POS for .NET) in our point of sale (POS) to integrate with the various peripherals used by POS systems. Simply put, POS for .NET enables developers to utilize receipt printers, cash drawers, barcode scanners, magnetic stripe readers (MSR), line displays (and many other peripherals) within their .NET applications. Back then, the .NET framework was at version 2.0. Obviously, many things have changed since then with the advent of .NET 3.0, 3.5 and, more recently, 4.0. However, the latest version of POS for .NET is v1.12, released in 2008.

      Being forward-thinking as we are, we structured our point of sale as a web application from day one, to enable future deployment scenarios (being browser-based means we can easily use our point of sale on the iPad or any other hot hardware platform) and code-reuse within our e-commerce application and FranchiseBlast. However, this made it a bit harder for us to integrate with the peripherals, as we weren’t using them in the traditional context of a desktop application (especially accessing Windows printers from a server-side web application). We solved those issues many years ago and have continued to evolve the solution ever since.

      Fast forward to 2011: POS for .NET has not been refreshed in three years, we’ve moved to 64-bit machines and .NET 4.0. This blog post is a collection of tips & tricks for issues commonly faced by .NET developers working with POS for .NET in 2011.

      Common Control Objects – don’t forget about them!

      This is just a reminder, as this was true back in 2006 too. You’d typically expect to be able to install the peripheral’s driver and then utilize it within your .NET application. However, you also need to install the intermediary Common Control Objects.  I always end up downloading the CCOs from here.  I always forget the proper order, sometimes run into trouble because of it, and end up uninstalling and reinstalling half a dozen times until it works (… pleasant…). I believe this is the installation order I use (you may need to reboot between each step).

      1. Install Epson OPOS ADK
      2. Install other drivers (scanners, etc.)
      3. Install the Common Control Objects
      4. Define logical device names (LDN) using Epson OPOS
      5. Install POS for .NET 


      POS for .NET doesn’t work in 64-bit

      Long story short, due to the legacy hardware it supports, POS for .NET only works in 32-bit. If you’re running an app on a 64-bit machine, it will fail with a cryptic error message or will simply be unable to find your peripherals. Example:

      System.Runtime.InteropServices.COMException (0x80040154): Retrieving the COM class factory for component with CLSID {CCB90102-B81E-11D2-AB74-0040054C3719} failed due to the following error: 80040154 Class not registered (Exception from HRESULT: 0x80040154 (REGDB_E_CLASSNOTREG)).

      You can still use the peripherals on 64-bit operating systems, but you will need to compile your desktop application as 32-bit (Right click on your project –> Build –> Platform target: x86). You even need to do this with the example application that comes with POS for .NET (in C:\Program Files (x86)\Microsoft Point Of Service\SDK\Samples\Sample Application) if you want to use it.

      You’ll probably run into the same issues with all the .NET test applications supplied by the device manufacturers. Unless you can manage to find an updated sample, you’ll have to work your magic with a decompiler. In addition to probably being illegal, it is a pain and a half. Therefore, you’re better off using the test application that comes with POS for .NET.

      As for web applications, you need to force IIS to run your application in a 32-bit application pool.

      POS for .NET doesn’t work in .NET 4.0

      Another bad surprise is migrating your application to .NET 4.0 and then realizing the POS hardware stops working. Long story short, you’ll get this error:

      This method explicitly uses CAS policy, which has been obsoleted by the .NET Framework. In order to enable CAS policy for compatibility reasons, please use the NetFx40_LegacySecurityPolicy configuration switch. Please see

      The error message is fairly self-explanatory. Microsoft stopped supporting “Code Access Security”, which is used internally by POS for .NET. You can either turn on a configuration option that re-enables the legacy CAS model or wait for Microsoft to release a new version of POS for .NET.  We’ve been told not to hold our breath, so the configuration option is the preferred approach.

      If you’re creating a desktop application, the solution is in the error message – more details here.  Add this to your app.config:

            <NetFx40_LegacySecurityPolicy enabled="true"/>


      If you’re creating a web application, the flag is a bit different. Add this to your web.config:

            <trust legacyCasModel="true"/>
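      For reference, here is a minimal sketch of where each flag sits in its configuration file (placement per the standard .NET configuration schema):

```xml
<!-- app.config (desktop application) -->
<configuration>
  <runtime>
    <NetFx40_LegacySecurityPolicy enabled="true" />
  </runtime>
</configuration>

<!-- web.config (ASP.NET application) -->
<configuration>
  <system.web>
    <trust legacyCasModel="true" />
  </system.web>
</configuration>
```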

      POS for .NET doesn’t work with ASP.NET MVC / dynamic data/operations

      The above flag will cause your legacy code to run properly on .NET 4.0 but it does have a side-effect. You will not be able to use some of the newer .NET framework features such as the dynamic keyword. Not only can you not use it explicitly within your own code, but ASP.NET MVC 3 uses it internally within the ViewBag.

      Dynamic operations can only be performed in homogenous AppDomain.

      Thus, you have to choose between POS for .NET or ASP.NET MVC 3, unless you load up your POS objects in another AppDomain. Here’s some sample code to help you do that.

      You need to be able to create another AppDomain and specify that this AppDomain should use the NetFx40_LegacySecurityPolicy option, even if your current AppDomain doesn’t have this flag enabled.

      var curr = AppDomain.CurrentDomain.SetupInformation;
      var info = new AppDomainSetup()
      {
          ApplicationBase = curr.ApplicationBase,
          LoaderOptimization = curr.LoaderOptimization,
          ConfigurationFile = curr.ConfigurationFile,
      };
      info.SetCompatibilitySwitches(new[] { "NetFx40_LegacySecurityPolicy" });
      return AppDomain.CreateDomain("POS Hardware AppDomain", null, info);


      You can then use this AppDomain to create your POS peripherals. All our peripherals extend our own custom PosHardware base class with a few standard methods such as FindAndOpenDevice(), so we use the following code. For testing purposes, we created a configuration option (IsHardwareLibInSameAppDomain) to toggle between loading in the current AppDomain versus a separate one.

      private T Build<T>(string id) where T : PosHardware, new()
      {
          T hardware = null;
          if (IsHardwareLibInSameAppDomain)
              hardware = new T();
          else
              hardware = (T)OtherAppDomain.CreateInstanceFromAndUnwrap(Assembly.GetAssembly(typeof(T)).Location, typeof(T).FullName);

          if (!string.IsNullOrEmpty(id))
              hardware.DeviceName = id;
          hardware.FindAndOpenDevice();
          return hardware;
      }


      Also, don’t forget to mark your classes as Serializable and MarshalByRefObject.

      [Serializable]
      public abstract class PosHardware : MarshalByRefObject


      Working with objects in other AppDomains is a pain.  Any object that you pass between the two AppDomains (such as parameters to functions or return values) must be marked as Serializable and should extend MarshalByRefObject if you wish to avoid surprises.  If you marshal by value instead, you will be working on read-only copies of your objects (which may or may not be desirable, depending on your context).


      It only took three years without a new release before POS for .NET started being a pain to work with – unless you stick with past technologies. With the advice provided here, however, you should be able to move forward without issue. Did you discover any other gotchas with POS for .NET?

      Gotcha: iPad versus ASP.NET

      clock May 29, 2011 14:19 by author JKealey

      Your web app looks awesome on the iPad, until…

      [Screenshot: FranchiseBlast franchise intranet on the iPad]

      You decide to save it to your home screen.

      If you’re doing this with a web application you’ve developed, you probably want to make it appear a bit more like a native app,  so you’ll add two meta tags to make the experience nicer (add an app icon and remove the navigation bar). Remember: Safari caches these tags when creating the shortcut, so you will need to delete/recreate the shortcut to force it to refresh.
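      For reference, the two tags look something like this (the icon path is an assumption; point it at your own image):

```html
<!-- App icon shown when the page is saved to the home screen (example path) -->
<link rel="apple-touch-icon" href="/apple-touch-icon.png" />
<!-- Hide Safari's chrome so the page launches full-screen from the home screen -->
<meta name="apple-mobile-web-app-capable" content="yes" />
```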

      Everything will look fine, until you reload the web app on some other occasion: ASP.NET Ajax is now completely broken and many of your styles are missing. Simply put, an application that worked fine when you shut down your iPad minutes ago will be completely unusable. No amount of refreshing will solve the issue. Clearing Safari’s cache and using it outside of the home screen icon is the only workaround.

      Gotcha: Safari uses different HTTP User-Agent strings depending on context!

      The iPad (and iPhone/iPod Touch) doesn’t use the same HTTP User-Agent string when a website is accessed normally via the Safari application versus a webpage that was saved to the home screen (which still uses Safari internally). Here’s an example:

      • Normal Safari: Mozilla/5.0 (iPad; U; CPU OS 4_3_3 like Mac OS X; en-us) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2
      • Website as app: Mozilla/5.0 (iPad; U; CPU OS 4_3_3 like Mac OS X; en-us) AppleWebKit/533.17.9 (KHTML, like Gecko) Mobile/8J2
      ASP.NET doesn’t recognize the latter as being Safari – it recognizes it as a generic browser with no capabilities. As an example: Supports JavaScript? Nope.

      Add this hack to your base page to fix it. It makes ASP.NET think that anything based on AppleWebKit is a newer browser – a much better default setting (for most of us) than assuming the browser was created back in 1994.
        protected void Page_PreInit(object sender, EventArgs e)
        {
            if (Request.UserAgent != null && Request.UserAgent.IndexOf("AppleWebKit", StringComparison.CurrentCultureIgnoreCase) > -1)
            {
                this.ClientTarget = "uplevel";
            }
        }
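      The two User-Agent strings above differ mainly by their trailing token: full Safari sends Version/x.y, while the home-screen web app does not. If you ever need to tell them apart in script, a hypothetical helper (the function name and heuristic are mine, not part of ASP.NET or Safari) could look like this:

```javascript
// Hypothetical heuristic: on iOS, full Safari includes a "Version/x.y" token
// in its User-Agent, while a page launched from a home-screen shortcut omits it.
function isIosHomeScreenApp(userAgent) {
    var isIosWebKit = /iPad|iPhone|iPod/.test(userAgent) &&
                      userAgent.indexOf("AppleWebKit") !== -1;
    var hasSafariVersionToken = /Version\/\d/.test(userAgent);
    return isIosWebKit && !hasSafariVersionToken;
}
```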

      The long answer

      On and off over the course of a month, I spent dozens of hours investigating this issue. In the end, the fix is trivial and updated *.browser files will probably be produced by Microsoft over the next months/years. However, I did learn a lot about how to debug a web-app on the iPad in the process, and thought I’d share a few lessons learned. I’ve kept them purposefully brief as you can easily find detailed answers on Google or StackOverflow.

      Enable Safari’s Developer Debug Console

      That’s how I figured out ASP.NET AJAX wasn’t loading properly: I was getting the JavaScript errors you normally get when ASP.NET Ajax is not properly configured in the web.config file.

      • On the iPad: Settings –> Safari –> Developer –> Debug Console (on)

      Install Firebug Lite on your iPad

      I frequently use Firebug on my desktop computer to analyse web pages. The lite version on the iPad helped me review the HTML/JavaScript in greater detail. Installing the Firebug Lite bookmarklet on the iPad is a bit painful, but worth the effort.

      Setup an HTTP Proxy

      This helped me get in-depth information about exactly what HTML/JavaScript was being served in response to which HTTP Headers. That’s how I realized certain scripts were not being included when the User-Agent changed.

      • As I develop on a Windows machine, I made the iPad’s traffic run through Fiddler on my desktop; similar proxy tools exist for other platforms.
      • In Fiddler, Tools > Fiddler Options > Connections –> Allow Remote Computers To Connect. Restart Fiddler. (Not working? Check Windows Firewall.)
      • On the iPad: Settings –> Wi-Fi –> Click on the right arrow for your connection –> HTTP Proxy –> Manual –> Set Server and Port. 

      Other limitations worth knowing about

      As everyone knows, Flash doesn’t work on the iPad/iPhone/iPod. It doesn’t come as a surprise to us that we have to eliminate it from our app (not a big deal for us).

      However, one gotcha that had not come to mind initially is that certain JavaScript functionality such as click&drag or drag&drop does not work on the iPad given the differences in gestures between a conventional computer and a tablet. That code needs to be rewritten.

      Did you experience any issues you’d like to share?

      Re-skin your existing web app using jQuery Mobile

      clock May 24, 2011 15:44 by author EtienneT

      Recently, for a new franchise client of ours, we had to add some new features to our web-based point of sale system.   This piece of software makes extensive use of touch screen functionality; we need to think about this when we design our UI.  The interface must be optimized to allow cashiers to perform their operations effectively and intuitively with a touch screen.  Our application isn’t traditionally used via a multi-touch interface like the iPhone or iPad; rather, operators use a traditional touch screen. In this project, we had to adapt existing pages and create completely new modules.

      In the past, we had played with jQuery Mobile and we were really impressed.  Take a look at the demo site – all you need to do is include a reference to jQuery Mobile’s JavaScript and follow some conventions in your HTML to get a nice mobile-friendly user interface. However, when you think about it, mobile-friendly (touch-friendly) user interfaces are also very appropriate for the traditional touch screen technologies utilized by the retail industry for over a decade.

      Given our point of sale was initially created to be compatible with IE6 back in the day, we felt it was time to enhance the look and feel utilizing new technologies. We liked jQuery Mobile’s look & feel and wanted to utilize some bits & pieces, without having to re-implement everything following their conventions.  When you peek at the jQuery Mobile source code on GitHub, you realize that each component is separated into its own little jQuery plugin.  Some plugins are not completely independent from the jQM (jQuery Mobile) page, but you can use most of them outside of that context; you can thus use them in your own applications without a jQM page that takes 100% of the screen real estate.  In a typical jQM scenario, you define a page and then jQM works its magic: it initializes all the controls for you - if you followed the conventions. In this post, we’ll cover using jQuery Mobile outside of the jQM page and mobile context.

      How to trick jQuery Mobile

      First, you simply have to trick jQM into thinking that there’s a jQM page in the HTML.  To do that, you have to bind to a special jQM event, mobileinit.  This event is executed before any jQM code modifies your HTML:

      $(document).bind("mobileinit", function () {
          if ($('div[data-role=page]').length === 0) {
              $('<div data-role="page" id="pageTemp" style="display:none;"></div>').appendTo($('body'));
          }
          $.mobile.ajaxEnabled = false;
          $('#pageTemp').live('pagebeforecreate', function (event) {
              return false;
          });
      });
      Here you insert a jQM page in your html if one hasn’t already been defined. This is required for some controls to work (like the dropdown list, if I remember correctly). We then disable AJAX page fetching and also disable the 'pagebeforecreate' event for the newly created dummy page.  Most of our pages utilize this placeholder (as we only use the UI controls), but we did – on a few occasions - utilize the standard jQM page in all its glory inside our point of sale. 

      If there’s a better way to do this, please let us know - with the current version of jQuery Mobile (1.0a4.1) it appears to be working pretty well.

      Interesting controls

      The plugins we found most interesting were the ones dealing with form controls. You can see a gallery of all form elements in jQuery Mobile here.

      For example, some of the base HTML controls are not ideal in the context of a touch-enabled application. Take radio buttons, for example. They are way too small and hard to click on accurately - you have to manually change their styling via CSS to make them easy to touch.  Here is the jQM version of a radio button list:

      Radio Button List


      To create this, first you need a little bit of HTML plus the checkboxradio plugin from jQuery Mobile:

      <fieldset data-role="controlgroup">
          <input type="radio" name="radio-choice" id="radio-choice-1" value="Cat" /><label for="radio-choice-1">Cat</label>
          <input type="radio" name="radio-choice" id="radio-choice-2" value="Dog" /><label for="radio-choice-2">Dog</label>
          <input type="radio" name="radio-choice" id="radio-choice-3" value="Hamster" checked="checked" /><label for="radio-choice-3">Hamster</label>
          <input type="radio" name="radio-choice" id="radio-choice-4" value="Lizard" /><label for="radio-choice-4">Lizard</label>
      </fieldset>

      Basically, the fieldset groups the radio inputs together and gives rounded corners only to the top and bottom items.

      Then you can add this JavaScript to your page:

      $('input[type=checkbox], input[type=radio]').filter(function () {
          return $(this).parent('.ui-checkbox, .ui-radio').length == 0;
      }).checkboxradio();

      $('fieldset[data-role=controlgroup]').not('.ui-controlgroup').controlgroup();
      This piece of JavaScript selects all checkbox or radio inputs, filters out the ones to which we have already applied the plugin, and then calls checkboxradio() to change them to follow the jQuery Mobile style.  We then use the controlgroup plugin to group the controls together visually.  Once again, we don’t re-apply this code to fieldsets that were changed previously, for efficiency reasons.

      For a horizontal look, simply add data-type="horizontal" to the fieldset grouping the elements.  It is still a radio button list, but the layout is different.
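      As a sketch, the markup for a horizontal group could look like this (names, ids and labels are placeholders; jQuery Mobile reads the data-type attribute on the fieldset):

```html
<fieldset data-role="controlgroup" data-type="horizontal">
    <input type="radio" name="view-choice" id="view-list" value="list" checked="checked" />
    <label for="view-list">List</label>
    <input type="radio" name="view-choice" id="view-grid" value="grid" />
    <label for="view-grid">Grid</label>
</fieldset>
```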


      Checkbox list

      You can even do the same thing with checkboxes:


      Suddenly, your UI becomes much easier to use with a touchscreen.


      Here is a dropdown list, in the jQuery Mobile style:


      You simply execute the following JavaScript code (as with the checkboxes, we skip elements that jQM has already enhanced):

      $('select').not('[multiple]').filter(function () {
          return $(this).parent('.ui-select').length == 0;
      }).selectmenu();
      We had some problems with multi-selection lists, so that’s why we don’t automatically style those.


      Buttons are pretty straightforward. In our application, we decided to automatically transform all of our buttons which utilize the CSS class ‘button’.  This could be an input[type=button], an input[type=submit], or a simple link <a></a>.  We’re using a mix of these in our application and modify them all along these lines (jQM provides buttonMarkup() for links and button() for form buttons):

      $('a.button').not('.ui-btn').buttonMarkup();
      $('input.button').filter(function () {
          return $(this).parent('.ui-btn').length == 0;
      }).button();
      Above, the two buttons are inline (add the data-inline="true" attribute to the HTML elements). The Save button has a different theme (specify data-theme="b").
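      A minimal sketch of the markup described above (hrefs and captions are placeholders):

```html
<a href="#" data-role="button" data-inline="true">Cancel</a>
<a href="#" data-role="button" data-inline="true" data-theme="b">Save</a>
```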


      You can also group buttons as above, by defining a horizontal controlgroup.

      <div data-role="controlgroup" data-type="horizontal">
          <a href="index.html" data-role="button">Yes</a>
          <a href="index.html" data-role="button">No</a>
          <a href="index.html" data-role="button">Maybe</a>
      </div>
      Working with controls

      Visit this page to learn which operations you can perform on your controls once they have been modified by jQM: Form Plugin Methods. This reference is very useful if you want to enable/disable controls or refresh them with new values.

      ASP.NET WebForms

      If you are using ASP.NET WebForms, as in our case, you want to run these plugins every time your page is modified.  If you are using ASP.NET UpdatePanels, you can bind a function to the endRequest event of the PageRequestManager and modify your controls each time an UpdatePanel is updated:

      Sys.WebForms.PageRequestManager.getInstance().add_endRequest(function () {
          // Add the jQuery Mobile code modifying controls here
      });

      Normally this job is done automatically by jQM, but since we are not using the controls inside a jQM page, we have to update the DOM manually after each postback.


      In its current form, jQuery Mobile is not compatible with Internet Explorer. Depending on the version of IE, it is either completely unusable or doesn’t look as good (rounded corner issues). In our context (point of sale), we ended up utilizing Google Chrome Frame for our IE users – at least for the time being. The jQM team appears to be working towards full IE support for their beta release.

      Future of jQuery Mobile

      In conclusion, we loved working with jQuery Mobile once we figured out how to utilize bits & pieces of it individually. Currently, jQM focuses on development for mobile devices (which is normal) but we would be thrilled if they made integration into existing projects simpler. As this is an open source project, we weren’t afraid to peek at the code to figure out why it wasn’t working the way we intended it to. Let’s hope the project keeps on improving both its modularity and the desktop-based functionality.  Thank you to the jQuery Mobile team, continue the great work!


      The opinions expressed herein are my own personal opinions and do not represent my employer's view in anyway.

      © Copyright 2014
