I just built a developer station for the company with specs substantially identical to the one I built for personal use a few months ago: an i7-950 processor, 12GB RAM, a 240GB SSD main drive, a 1TB storage drive and a Radeon HD 5770 graphics card. I economized a bit by going with an 800-watt power supply, RAM not suited to overclocking and a cheaper case. The net result is a cost of less than $2,000, including a whole bunch of extras I didn’t have to buy for my personal computer: three brand-new 23″ flat screens, DVI cables, a UPS, keyboard, mouse, webcam and so on. Apples to apples, it was about 33% less expensive than my last build. Amazing how fast computer prices fall.
Xoom Impressions
I took the plunge and bought the new Xoom Android 3.0 tablet four days ago and so far I am generally impressed. That’s not to say there aren’t substantial flaws: applications that don’t know how to handle the large screen (e.g. Mint), applications that don’t yet match their iPad counterparts (e.g. Skype without video support) and applications promised but not yet released (e.g. LogMeIn Ignition and Flash). However, the good outweighs the bad. The tablet is fast and responsive, the built-in applications for web browsing, email and calendar are excellent and the screen is very good indeed.
I can certainly see myself traveling with the Xoom instead of a laptop as long as I don’t have to do heavy-duty development. For example, I was able to access a development environment hosted on Amazon EC2 via RDP to do a little test, fix and patch for a C# application over an average broadband connection, without the benefit of a Bluetooth keyboard. Although I would not try this over the current 3G wireless, I fully expect Verizon’s 4G (upgrade available soon) to be up to the task.
On the negative side, quite a bit of the Xoom’s potential is untapped right now. Besides 4G, early adopters will have to wait for Flash, support for the microSD slot and versions of popular applications that take full advantage of things like the front-facing camera and the large screen. Although overall stability is good, I did experience problems with some popular applications such as Skype and Mint.
Developing for the Xoom has been a good experience so far. I use IntelliJ IDEA with the Android SDK, and developing against the device has been trouble-free. The emulator, on the other hand, is ridiculously slow even on my otherwise fast i7-950 desktop. For example, I saw waits of up to three minutes when starting a simple hello-world application on the emulator. I can’t quite understand why it has to be so much slower than the emulator for the phone form factors.
If you want to develop for Android tablets or hate big-brother Apple, you will be happy with the Xoom tablet as it exists right now. However, average users would probably be happier with an iPad 2. It is lighter and thinner, has more applications available and has a better UI. Although the Xoom has slightly better hardware, right now the software is a bit too rough around the edges to recommend an Android tablet over the iPad 2 for average users. I fully expect open source, hardware competition and Google to eventually trump the iEmpire, but for now Jobs and company still come out on top.
I wrote the original version of this post on the Xoom. Unfortunately, the open source WordPress Android application chopped up several of the paragraphs and inserted block quotes seemingly at random. I guess there is at least one more application in need of an upgrade.
Put Your Apps on the TopShelf
Many of my projects end up using a Windows service or three to host background processes. Over the years, I’ve developed a common-sense strategy: put the functionality in a server class that implements start and stop methods, then create minimal command-line and Windows service hosts that instantiate the server class and call start and stop when appropriate. This gives me a command-line server that can be conveniently started from the debugger and a Windows service application for use in the production environment. Of course, it also means using InstallUtil when it comes time to install the service.
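In sketch form, the pattern looks something like this. The names are hypothetical stand-ins for the real classes, and the two hosts normally live in separate projects:

```csharp
using System;
using System.ServiceProcess;

// The server class contains the real functionality behind start and stop.
public class MyServer
{
    public void Start() { /* spin up timers, worker threads, etc. */ }
    public void Stop()  { /* shut everything down cleanly */ }
}

// Minimal command-line host, handy for running under the debugger.
public static class ConsoleHost
{
    public static void Main()
    {
        var server = new MyServer();
        server.Start();
        Console.WriteLine("Server running. Press Enter to stop.");
        Console.ReadLine();
        server.Stop();
    }
}

// Minimal Windows service host for production, installed via InstallUtil.
public class MyServerService : ServiceBase
{
    private readonly MyServer _server = new MyServer();
    protected override void OnStart(string[] args) { _server.Start(); }
    protected override void OnStop() { _server.Stop(); }
}
```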
Today I stumbled across a much nicer solution in the open source Topshelf project. It lets me build a console application, using about ten lines of code, that hosts my server for development and provides a command line to install it as a Windows service, so InstallUtil is not required. Highly recommended!
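Here is roughly what those ten lines look like, reusing the hypothetical MyServer from the sketch above; the fluent method names are from the Topshelf version I’m using and may differ in other releases:

```csharp
using Topshelf;

public static class Program
{
    public static void Main()
    {
        HostFactory.Run(x =>
        {
            // Tell Topshelf how to create, start and stop the server.
            x.Service<MyServer>(s =>
            {
                s.ConstructUsing(name => new MyServer());
                s.WhenStarted(server => server.Start());
                s.WhenStopped(server => server.Stop());
            });
            x.RunAsLocalSystem();
            x.SetServiceName("MyServer");
            x.SetDisplayName("My Server");
            x.SetDescription("Hosts the MyServer background process.");
        });
    }
}
```

Running the resulting executable directly gives you the console host; running it with the install argument registers it as a Windows service.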
Cross-Domain Post With Silverlight or Why I Hate Hackers
All I wanted was a nifty Silverlight demo of my company’s new .NET package rating API for the commercial print industry. The main goal of the demo was to provide sample code that developers could look at to understand how easy it is to use. To keep the sample code simple, I wanted the Silverlight application to call the API directly. Should have been easy, right? Well, it wasn’t, mainly because the API does an HTTP POST to a site in a different domain and Silverlight just doesn’t allow that.
I’m not going to regurgitate the various ways you can get around this limitation. Just google “cross domain post silverlight” if you want the technical details. I settled on a proxy approach, which I implemented grumbling all the way.
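For the curious, a stripped-down version of the proxy looks something like the sketch below. The handler name and API URL are made up for illustration, and the real code adds error handling:

```csharp
using System.Net;
using System.Web;

// RatingProxy.ashx -- lives in the same domain as the Silverlight app
// and forwards a POST to the rating API's domain, which Silverlight
// cannot call directly.
public class RatingProxy : IHttpHandler
{
    // Hypothetical endpoint; substitute the real API URL.
    private const string ApiUrl = "https://api.example.com/rate";

    public void ProcessRequest(HttpContext context)
    {
        var request = (HttpWebRequest)WebRequest.Create(ApiUrl);
        request.Method = "POST";
        request.ContentType = context.Request.ContentType;

        // Copy the incoming request body to the outgoing request.
        using (var requestStream = request.GetRequestStream())
        {
            context.Request.InputStream.CopyTo(requestStream);
        }

        // Relay the API's response back to the Silverlight client.
        using (var response = (HttpWebResponse)request.GetResponse())
        {
            context.Response.ContentType = response.ContentType;
            response.GetResponseStream().CopyTo(context.Response.OutputStream);
        }
    }

    public bool IsReusable { get { return true; } }
}
```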
Grumbling because it only has to be this annoying because hackers have developed a number of malicious ways to use cross-domain posts to do nasty things. Again, I’ll spare you the details, but you can follow the link if you really want them. Suffice it to say that because some people find it amusing, profitable or fulfilling to lie, cheat and steal, you have to jump through hoops to post something to an outside domain from Silverlight. It’s really the same reason I have to waste $20 per month on alarm monitoring for my home. If a very few people did not suck, we would not have to lock our doors, we would not have to install annoying virus-checking software on our computers and we could do cross-domain posts from Silverlight without requiring the domain we are calling to serve up some magic file.
End rant.
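For reference, the “magic file” in Silverlight’s case is clientaccesspolicy.xml, served from the root of the domain being called. A wide-open policy (you would normally restrict the allowed domains) looks something like this:

```xml
<?xml version="1.0" encoding="utf-8"?>
<access-policy>
  <cross-domain-access>
    <policy>
      <!-- Allow calls from any domain; lock this down in production. -->
      <allow-from http-request-headers="*">
        <domain uri="*"/>
      </allow-from>
      <grant-to>
        <resource path="/" include-subpaths="true"/>
      </grant-to>
    </policy>
  </cross-domain-access>
</access-policy>
```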
SSD + i7-950 + 12GB + Three Screens = Fast and Productive Development
I’ve been developing on the new computer I built for about two months now, and I am very happy with my investment. So far I have spent $2,100 and a couple of evenings of personal time. The results are lightning-fast compiles, the ability to run a couple of VMs without impacting the performance of my VS 2010 development environment and plenty of screen real estate to work with. In short, a much more productive development environment. The key parts are as follows:
- ASUS P6X58D-E Motherboard
- Intel i7-950 (lightly overclocked to 3.2GHz)
- 3 x 4GB G.Skill DDR3-1600 RAM
- Corsair Force 240GB SSD (primary drive)
- Western Digital Caviar Black 1TB (data drive)
- Sapphire 1GB Radeon HD 5770 Video Card
I used to develop on a fast laptop since I had to travel quite a bit, so the only existing parts I could reuse were the external monitors. Since I already had them, they are not included in the cost of the new computer.
- Two relatively cheap 1920×1080 23″ widescreens that I was using with the laptop at home ($199 each about six months ago).
- One much older 19″ in portrait orientation (1024×1280). This monitor was sitting in my closet because I could only hook up two external monitors to my laptop.
The biggest improvements come from the SSD. I installed the OS, SQL Server and development tools on it, which left me close to 200GB free for my code. Compiles are dramatically faster, which gives me less time to sip my soda while I wait for my unit tests to run. Computer startup, whether from hibernate or from power off, is also greatly improved. The Corsair is a second-generation SSD and includes TRIM support, which should keep performance consistent for the life of the drive.
VS 2010 + ReSharper is also much snappier now thanks to the four hyper-threaded cores of the i7-950. The development environment has been so fast that I was encouraged to turn on ReSharper’s solution-wide analysis on my biggest projects for the first time. Surprisingly, I did not notice any slowdown. The Java IDE I use for Android development, JetBrains IntelliJ IDEA, is also running faster than before.
Adding a third screen in portrait mode has also improved my productivity. It lets me keep documentation in view at all times and frees the widescreens for development environment, GUIs I am debugging, VMs and remote desktop screens.
Speaking of VMs, plenty of RAM and CPU to spare lets me run a couple of big ones without impacting the performance of the development environment. I tend to keep the VM disk images on my data drive, but even so I have noticed a small improvement in their I/O performance.
The other nice thing about the desktop is growth potential. I can double my RAM to 24GB by adding three more 4GB DIMMs. There is plenty of room for more hard drives, and I even have a couple of 6Gb/s SATA 3 ports to let me take advantage of the even faster SSDs coming in 2011 and beyond. Mid-range video cards making their way to market now support up to eight monitors. Some day I can even upgrade the CPU to one with six or possibly more cores running a bit faster than the cores I have now. The bottom line is there is a good chance this box will carry me to the day when my development environment lives in the cloud and all I need at home is a terminal with a huge screen.
Oh, and the Windows Experience scores are as follows:
- Processor 7.6
- Memory 7.9
- Graphics 7.4
- 3D Graphics 7.4
- Primary Disk 7.5
First Impressions of Silverlight
- It’s too bad I have to make a Silverlight-specific version of the API DLL. I know it would be easy to wrap in a web service, but I want the demo app to call the API directly.
- Creating my Silverlight API DLL is not as bad as I thought it would be, thanks to VS 2010 linked files and a couple of bits of conditionally compiled code specific to Silverlight.
- Caliburn Micro is a very nice lightweight open source MVVM library for Silverlight and WPF. It works on Windows Phone 7 too. The Soup to Nuts tutorial series is quite helpful.
- Silverlight sure is sticky about the foreground thread. You absolutely have to do your work in the background.
- Too bad I can’t use log4net. Yes, I know about Clog.
- Silverlight’s HttpWebRequest only supports asynchronous methods, which I find sort of annoying. I realize I need to put this processing in the background anyway, but I think wrapping a synchronous HttpWebRequest in a BackgroundWorker would be much simpler (see the sketch after this list).
- The Silverlight testing framework is pretty decent as long as you do not try to use it with Chrome. It has nice support for testing my asynchronous API. Too bad the asynchronous tests are not at all compatible with standard MSTest.
- Although the visual designer for XAML in VS 2010 is decent, it is often easier to hand-edit the XAML.
- I need to display source code with syntax highlighting. Jeff Wilcox has put together a nice syntax highlighting text block that does what I need.
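To make the asynchronous-only complaint above concrete, here is a bare-bones sketch of a POST from Silverlight, including the hop back to the UI thread. The URL and names are made up for illustration, not taken from the actual demo:

```csharp
using System;
using System.IO;
using System.Net;
using System.Windows;

public static class RatingClient
{
    public static void PostRating(string body, Action<string> onComplete)
    {
        // Hypothetical endpoint for illustration.
        var request = (HttpWebRequest)WebRequest.Create(
            new Uri("http://example.com/api/rate"));
        request.Method = "POST";

        // Silverlight only exposes the Begin/End asynchronous pattern.
        request.BeginGetRequestStream(requestResult =>
        {
            using (var stream = request.EndGetRequestStream(requestResult))
            using (var writer = new StreamWriter(stream))
            {
                writer.Write(body);
            }

            request.BeginGetResponse(responseResult =>
            {
                string reply;
                using (var response = request.EndGetResponse(responseResult))
                using (var reader = new StreamReader(response.GetResponseStream()))
                {
                    reply = reader.ReadToEnd();
                }

                // The callback runs on a background thread; marshal back to
                // the UI thread before touching any controls.
                Deployment.Current.Dispatcher.BeginInvoke(() => onComplete(reply));
            }, null);
        }, null);
    }
}
```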
Review of “Coders at Work” by Peter Seibel
As young as programming is, it seems that most practitioners in the field have little sense of its history. I’m not entirely sure why this is. Perhaps it is caused by the twin realities of rapidly evolving technologies and the relative youth of the folks who write code. Whatever the reasons, it seems like nobody thinks very much about what happened in languages like assembly, Lisp, Fortran and Smalltalk on platforms like early IBM mainframes, Altos and PDP-11s back in the dark ages before the dawn of the Internet age. It seems to me that it pays to understand where your field has come from. If you feel as I do about this, you should take the time to read “Coders at Work”, a collection of interviews with some of the all-time greatest programmers that most of us have never heard of.
Well, hopefully you’ve heard of some of them. Guys like Ken Thompson (co-creator of UNIX) and Donald Knuth (author of the masterwork “The Art of Computer Programming”) have set down bodies of work that demand attention. Some of them, like Brendan Eich (CTO at the Mozilla Foundation) and Joshua Bloch (Chief Java Architect at Google), might be well-known because of their current positions more than their past work. Others, like Fran Allen (2006 Turing Award winner), are probably unknown to 99% of the programmers on the planet. No matter their fame, every single one of them has fascinating things to say about where we’ve been and where we’re going as a profession.
I think what struck me the most as I read through the first-hand accounts was just how little computer these guys had to work with back in the early days. I thought I had it rough at the start of my career writing systems using HP3000 Basic and its two-character limit on variable names until I read about programming analog computers in the 50s and the early digital computers that supported as little as 200 words of memory. It makes you wonder what people will think of our mildly multi-core servers thirty years from now. It is also amazing how programming has remained just about as hard as it was back then. Sure, we have better tools now, but our users expect much more too.
Although this book does not offer the sweeping, dispassionate viewpoint of a true history, it provides the invaluable personal perspective of the people who made history happen. Reading this book is about the closest many of us will ever get to joggling punch cards and toggling switches to enter code. It’s definitely worthwhile reading for every professional coder.
Check out “Coders at Work” at Amazon.
Review of “Founders at Work”
By the time I finished shutting down my company, Objective Advantage, in March 2009, I was seriously burned out on the idea of entrepreneurship. Starting a company is difficult. Selling off the bits and shutting it down after 11+ years is tougher. When I sat down and considered my future career path, I really figured the only way forward was to go back to being an employee again – no more 70-hour work weeks, no more agonizing over payroll and fewer sleepless nights. Sure, I’d lose some freedom and some financial upside, but the tradeoffs seemed like a no-brainer. I even turned down an opportunity to own a piece of a company that bought some of the IP from my former company and instead joined them as an employee.
Then I read “Founders at Work”, a collection of interviews with founders of famous technology companies. As I read through the stories, the few embers left over from the fire that had kept me self-employed for almost 20 years started to glow hot again. Each page felt like a personal conversation with one smart founder after another. My old heroes with their garage-startup VC war stories were all there — Steve Wozniak (Apple) and Dan Bricklin (creator of VisiCalc) were two of my personal favorites. There were also great stories from Web 2.0-style founders who self-funded their companies, like Joel Spolsky of Fog Creek. Every founder gave me a useful lesson. Every founder reminded me of something I liked about startups. By the end, the flame of entrepreneurship was burning hot in my gut once again.
How often do you find a book that inspires you? If you have ever considered starting a company or even joining a startup, this book is a must-read. Just be careful: the inspiration, as Thomas Edison famously said, is only 1% of what it takes.
See this book at Amazon.com.
Nant Task to Synchronize Scripts Folder to Database Using RedGate SQL Compare API
RedGate’s SQL Compare is an indispensable tool for anyone doing development against Microsoft SQL Server. Its ability to export the database structure to a scripts folder and then synchronize changes from a database back to the scripts folder is its greatest feature; it allows the database structure to be placed under source control with a high degree of granularity, since each database object becomes a script file.
Our build process includes a step that deploys a test/demo version of our web application to a server. As part of that process, we need to update the database structure from the scripts folder. Thanks to the SQL Compare API included with RedGate’s SQL Toolbelt, I was able to put together an NAnt task that does the job. Here is the actual NAnt target from our build script:
```xml
<target name="updateDatabaseStructure">
  <schemaSync sourceScriptsPath="${project.dir}\db\OnpointConnect"
              destinationServerName="${sqlServerInstance}"
              destinationDbName="OnpointConnect"/>
</target>
```
You’ll notice the target uses a couple of properties to point at the project root (${project.dir}) and the SQL Server instance name (${sqlServerInstance}). Since the target does not specify a user name and password, the schemaSync task uses the Windows credentials of the user running the build to log on to the database. In the case of our automated build, the account that runs the build process has access to the database. You can switch to a SQL Server login by providing the user name and password attributes described below.
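For example, a variant of the target above that logs on to the destination database with a SQL Server login might look like this; the ${dbUser} and ${dbPassword} properties are hypothetical placeholders:

```xml
<target name="updateDatabaseStructure">
  <schemaSync sourceScriptsPath="${project.dir}\db\OnpointConnect"
              destinationServerName="${sqlServerInstance}"
              destinationDbName="OnpointConnect"
              destinationDbUserName="${dbUser}"
              destinationDbPassword="${dbPassword}"/>
</target>
```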
The complete list of supported attributes is as follows:
| Attribute | Description |
| --- | --- |
| sourceServerName | If synchronizing from a database, the name of the source database server. |
| sourceDbName | If synchronizing from a database, the name of the source database. |
| sourceDbUserName | If using SQL Server authentication, the name of the user in the source database. Omit this attribute to use Windows authentication. |
| sourceDbPassword | If using SQL Server authentication, the password of the user in the source database. Omit this attribute to use Windows authentication. |
| sourceScriptsPath | If synchronizing from a scripts folder, the path to the scripts folder. |
| destinationServerName | The name of the server to synchronize to. REQUIRED. |
| destinationDbName | The name of the database to synchronize to. REQUIRED. |
| destinationDbUserName | If using SQL Server authentication, the name of the user in the destination database. Omit this attribute to use Windows authentication. |
| destinationDbPassword | If using SQL Server authentication, the password of the user in the destination database. Omit this attribute to use Windows authentication. |
You can download the source code from my GitHub SouthSideDevToys repository.
RedGate’s licensing requires you to purchase the SQL Comparison SDK after a demo period. At this time the product sells for $695, and you should be able to get a 15-day demo from RedGate.
The code is fairly self-explanatory. I wrapped an NAnt task around a set of classes that work with the API. The NAnt task assumes the destination will always be a database, even though the underlying code is capable of synchronizing to a scripts folder. If you have questions about the code, feel free to comment and I will respond with more information.
RedGate just released Data Compare 8.0, which allows data comparisons and synchronization with scripts. RedGate says the next version of their API, due out at the end of August 2009 (UPDATE: it can be obtained by licensed customers through support), will include this capability as well. Our database includes configuration data that we currently roll out via SQL scripts. This process is a pain to maintain because each time the data changes we have to update and test our scripts. As soon as RedGate’s new API becomes available, I plan to change this process to use Data Compare via NAnt so that everything is automatic. I’ll publish the NAnt task as soon as I have it working.
Check Your Assumptions at the Door
Every year around this time I start going to McDonald’s almost every day for lunch for one reason and one reason only: Monopoly. I’ve faithfully stuck game pieces to my little board every year in hopes of winning some valuable prize. I’m not greedy. I never dream about winning the big one. No, anything worth more than about $2 would be just fine with me. As you can guess, I’ve won plenty of fries and small Cokes over the years but never the elusive $2+.
Anyway, these last couple of years they’ve added a web-based component located at http://www.playatmcd.com. It’s a rather pretty, Flash-based thing that asks you to put in annoyingly long codes from each of your stamps. Once it has validated the code, it lets you roll the dice and moves your piece around the board. All in all, it provides a perfectly satisfying McDonald’s Monopoly experience. One particular feature of the game is that when you land on Community Chest or Chance you get Coke rewards. When you land on the winning square, a message pops up over the game board to inform you of your good fortune, and a follow-up email shows up in your inbox with the info on how to log on to the Coke rewards site to claim your points. I never gave the feature much thought. I’d land on the square, I’d get the popup and a few minutes later the email would show up. It all seemed to work perfectly well until yesterday, when I moved my email from GoDaddy to Gmail.
It turns out the system isn’t very smart about the possibility of fast email. First, I rolled an 11, and while my piece was still moving Gmail alerted me about a new message from McDonald’s about a Coke reward. A little odd, but not too bad. Clearly, the server sent the email at the same time it told the Flash client where to land my piece. Not perfect, but probably unavoidable. The second thing that happened really bugged me. I rolled double threes and landed on a property. Just after landing, I got an email about another Coke reward, which did not make any sense since the game had told me nothing about winning one. My second and last roll (no doubles this time) landed me on Community Chest, where I finally won the Coke reward I had received the email about a couple of minutes before.
So what does this tell me? Well, when the site receives the code from my game piece it must calculate the square where I will land, whether that takes one roll or two, and it sends the prize email at the same time it tells the Flash client what to do. The Flash client then adds the nice dice-rolling animation and shows the piece moving around the board; while that is going on, the email is making its way to my inbox, which is now so fast that I see the email before the piece stops moving. I never noticed this flaw in the logic before because my old email didn’t show up nearly fast enough to expose it. My guess is the developers of the site either missed this or figured nobody would notice. Once you see it happen, the illusion that pressing the button to roll the dice means something is shattered and the game is no longer much fun. It makes me wonder why they bothered to implement the animated game board at all. After all, they have a facility that lets you enter a code and see what you get without the animation.
In the end, it all comes down to where bad assumptions can lead. The developers assumed I would not notice the pointlessness of rolling the dice. Given my reaction to what I saw, I guess somewhere in the primitive part of my brain I actually thought the dice roll mattered.
I leave you with my favorite version of the old saying about assumptions.