Monday, August 26, 2013

Full HP TouchSmart 15t-J000 Quad Edition Notebook Review

I finally decided to pull the trigger on a new laptop – this one mainly for at-home use, so it's a lot more powerful than my portable Lenovo Yoga 13, which I absolutely love. I mentioned before that I wanted to go lightweight for travel, but unfortunately I still haven't found that ultimate laptop with all of the specs in one package – i.e. fast SSD, high memory, thin form factor, etc. So my trade-off will be a bigger laptop that I use here and take on the road only when needed, while keeping the lighter Yoga for when I travel or want to be portable around the house. The ASUS VivoTab that has served me well (I use it every day) will go to my daughter, and I'll use the Yoga in its stead.

I'm not much of a do-it-yourself type of person, so I wanted something I could configure completely and order online. I've had tremendous success with electronics and, yes, even computers through Amazon.com (I am aware of NewEgg.com and TigerDirect.com, etc.) so I decided to go there. There are two different vendors I've worked with through the Amazon site and I'm completely satisfied with both. I received my Lenovo from Eluktronics, Inc. and the TouchSmart came from ULTRA Computers. In both cases I wanted something custom (for the Lenovo it was an i7 when the only listed model was an i5, and for the TouchSmart I wanted a graphics card added to the 500 GB SSD model). In both cases the sellers responded immediately, set up a SKU to satisfy my request, then built and shipped it right away. I couldn't be more pleased. I can't recall the return policy from Eluktronics, but ULTRA Computers offered me a 60-day return policy.

Configuration

To be completely honest, HP was not my first choice for a vendor – I've had a lot of success with ASUS, Dell, and Samsung products, and of course there is the MacBook series that can't be ignored. However, after doing extensive research, the only configuration I could come up with was through the HP product. Here are the hardware specs I ended up with:

Intel Core i7-4700MQ (2.4 GHz, 6 MB cache, 4 cores with 8 logical) – turbo boost up to 3.2 GHz
Intel HD Graphics 4600 on the board but added the NVidia GeForce GT 740 with 2 GB dedicated memory
16 GB DDR3 RAM
512 GB Samsung 840 SSD
RealTek 10/100/1000 Gigabit Ethernet LAN
Intel 802.11 b/g/n WLAN (does NOT appear to be dual-band), WiDi
4x USB 3.0 ports
1 full-size HDMI
1 RJ45
1 headphone/microphone combo
Multi-format card reader
15.6” diagonal LED-backlit touch screen 1080p
Full-size island keyboard with number pad
HP TrueVision HD WebCam
Integrated Dual Array Microphone
Beats Audio w/ 2 subwoofers, 4 speakers

Unboxing

Unlike the ASUS and Lenovo, the HP packaging was direct and to the point – not much invested in the presentation, but I couldn't care less. I want the goods. So here are the goods, sans boxes and packaging foam:

From the top:

[Photo: the laptop from the top]

Note the reflective sticker (you can see me in the reflection). I'll share the first con here: build quality. Although this will work fine for me, some will expect a more expensive laptop to feel solid throughout. The keyboard deck and bottom feel like solid aluminum, but tapping on the lid gives the impression it is plastic painted to look like metal. I'm not sure if it is, but it certainly has that sound and feel. It's a little like a cheaper Acer laptop I used to own – you can press hard on the lid and see artifacts on the display inside. Some people will run the other way because of this, but I've had plenty of laptops built this way and haven't had an issue, so it's not a deal breaker for me.

Even though this laptop is larger (15") and heavier (5+ lbs.), it feels nothing like the thick, brick-like Dell I had before. When I was just reading specs I was concerned, but upon receiving it, my fears proved unfounded. There is a nice taper that gives it a slim face despite the power it is packing:

[Photo: side profile showing the tapered edge]

When facing the laptop, the left side has a security cable slot, vents, HDMI, USB 3.0 charging port, USB 3.0 port, digital card reader, and hard drive and power LEDs. This angle demonstrates the taper so you can see the tilt of the keyboard:

[Photo: left side ports]

The right side features the dual audio out/in (microphone and headphones), 2x USB 3.0 ports, Ethernet status lights, full size RJ-45 jack, AC adapter light and the connector for the adapter.

[Photo: right side ports]

Windows 8 and Support Woes

The second con was not really HP's problem until I contacted support and discovered they were completely incapable of advanced troubleshooting. In the middle of installing my development environment, I did a reboot and suddenly found no Windows Store apps would launch. Apparently this is a widespread issue, because a few searches turned up a lot of frustrated people whose systems suddenly stopped launching Windows Store apps. The common fixes (latest display driver, ensuring more than 1024 x 768 resolution, etc.) were already addressed on my machine. Another common tactic is to uninstall, then reinstall the apps – but my challenge was that the Windows Store app itself wouldn't launch (even after clearing the cache). I tried the system refresh and that failed. HP Support was clueless and obviously following a script, so I ended up just reinstalling Windows 8.

This was a little challenging as well. After installing it (which happens FAST thanks to the SSD), I went to the HP site and asked it to identify my system. It identified the wrong model, so when I installed the Intel chipset driver it froze my system. That meant another wipe and reinstall.

This time I specifically selected the model I knew it was, and the drivers worked fine. Once I got through that step the system worked well (even better without the bloatware on it) and I was very satisfied. A major “pro” with Windows 8 is the profile synchronization. After installing the OS, most of my tiles “went live” with data. All of the Windows 8 apps remembered my settings and just started working. I was surprised by how much actually synchronizes – for example, I went to the command prompt and it was configured exactly the way I like it (I make it use a huge font so it’s easy to present with).

Keyboard and Number Pad

The larger form factor gave HP room to provide a generous keyboard.

[Photo: the keyboard and number pad]

The keys are laid out nicely. I love the "island style" (keys are recessed so the lid closes flush without contacting them) and the keys have good travel. Having a full-sized number pad is also great. So how is the keyboard overall? My response here will be a bit loaded, so bear with me. If I hadn't purchased a Lenovo Yoga I would say this is a great keyboard – one of the best I've typed on. I have read that some people find it loses key presses, but that has not been an issue for me. It feels natural, and I'm able to type extremely fast without feeling cramped. However, since I have used a Lenovo I can say it is still inferior to the Yoga keyboard – I still believe Lenovo makes the best keyboards in the industry. Any time I feel the typing experience is great on the HP, I go over to my Yoga and suddenly it just feels better – the keys feel more solid, and the keyboard is a lot quieter than the HP's.

The keyboard follows the ultrabook trend of wiring the function keys to hardware actions (brightness, volume, and so on) rather than the traditional F1–F12 keys, so a developer has to remember to hold down Fn + F5 when debugging, for example. This can be turned off in the BIOS. The keyboard has substantial flex: when you press keys on the left side, you can definitely see the keys around them move. This is a deal breaker for some people. It doesn't bother me, and I wouldn't have noticed if I hadn't known to look for it, but it is something to keep in mind (I suggest trying out a model in person if it's a concern). The arrow keys are horrible – up and down share the space of a single key, and I'm constantly missing the right one when I try to navigate. I'll need to retrain myself to use the arrows on the number pad instead. The page up/down/home/end keys run horizontally along the upper right, while the Lenovo has them vertical, so I am having to retrain myself there as well.

Touch Pad

I was pleasantly surprised by the touch pad. I've given up my mouse completely and now exclusively use the touch pad, so it is important to me that it works well, is responsive, and handles gestures for pinch, zoom, rotate, and scrolling. When I first started using the laptop I did not like the rough texture – the Yoga's pad is smooth glass, so this felt a little odd. The lack of separate left/right click buttons was no issue because that is how the Yoga is configured and I'm used to it. What is interesting is that after a couple of days I've adjusted, and now the Yoga feels "slippery" with its glass. I guess it's a question of which you use more.

The pad itself is incredibly responsive. I had read some concerns over quality, but I haven't found anything other than one glitch where two-finger scrolling stopped working. I had to reboot and it magically came back and hasn't repeated itself, but it's something to keep in mind. The Lenovo machines also suffered from these types of issues in the early days before the drivers stabilized. I use various gestures, including the Windows 8 app and charm bars, and these all work flawlessly. Some people have commented on the touch pad being off center. If you look at the image, while it is off center relative to the frame of the laptop, it is actually centered on the keyboard itself, which is what matters, and it feels well-placed to me.

Wrist detection is fine and I have had no issues with accidentally brushing the touch pad and having the cursor jump while typing. I also haven't experienced the converse: the touch pad becoming unresponsive because a portion of my finger is resting on it. However, I must add that I try to type ergonomically, which means I don't rest my wrists on the keyboard – they brush it, but I keep them elevated.

This entire blog post was composed on the new laptop.

Display

The model I have comes with a 1080p touch display. The display itself is an interesting combination of pros and cons. First, the cons. Again, I'm spoiled by the Yoga because it has a beautiful, clear display with incredible viewing angles. It is bright and readable from almost any angle, and while it is not fully matte, so I do get some reflections, they don't interfere with everyday use. The HP display, on the other hand, is extremely glossy. I purposely took a picture that shows my reflection as well as a strange line of "light" across the right side, which is sunlight reflected through the edge of some blinds behind me. It is very reflective, more so than the Yoga, so I don't imagine it would do well on a deck in the sunshine. Fortunately, I typically have it set up in my office with no sunlight, so it's not a deal breaker – but again, something to look at.

[Photo: the glossy display showing reflections]

The viewing angles are also limited. Vertical is not so bad (viewing from straight on or from above is fine, but don't try to look "up" from below or it will wash out), but horizontal is a very narrow range. I guess optimistically you could say it has built-in privacy protection. Some people are very passionate about their viewing angles and I thought it would be a problem, but honestly, after several days of use I haven't noticed. I've used it on my desk, on a table in the office, and on my lap, and it works fine. When viewed straight on, it is very bright and clear.

That segues into the pros: the display is very crisp and clear. I don't even bump up the font size despite it being 1080p, because I love my large workspace and can read even 8pt fonts fine. I used a monitor calibration tool and it did very well across the board (again, provided I stay in the right viewing window). For what I do – heavy development and writing – it is perfect. Honestly, I feel like I fooled myself into settling for the Yoga's 720p display, because the full 1080p is just so much more functional for me. Within the optimal viewing angles the display is superb; it only loses that quality at extreme angles, which typically wouldn't be the case anyway. Although the picture above shows the glossiness of the display, I worked for a full day in that same environment with no issues or distractions because it is clear and bright when viewed straight on.

Fingerprint Scanner

How cool is that? I thought this would be more of a "nice to have" that gives someone bragging rights (I have Mission Impossible on my laptop), but now I'm going to be spoiled even more. It turns out that despite the possible added security (and I know fingerprint readers have been around for quite some time), it is convenience that wins out. As a corporate computer it is constantly locked when I step away, and being able to unlock and log in with a simple swipe of my index finger is quite nice.

Audio

I didn't really factor audio into my purchase decision at all, because I typically use headphones and a microphone, so anything built in is just a bonus. The model comes with Beats Audio, which I assumed was sort of a gimmick – OK, great, two subwoofers and four speakers, wow (twirl finger in the air). Upon receiving it … WOW! The audio is fantastic. In fact, I switched from using Skype with headphones to using the speakers (I work from home, so there's no one to distract – if my daughter is in one of her classes in the next room, I'll go back to headphones). The sound is really, really clear and has a great range despite coming from a laptop. I am really impressed with how they engineered the audio. I played some Netflix with the volume jacked all the way up and it sounded fantastic, including the lower end. I can see this as an easy "set it on the table and watch videos without headphones" laptop.

The built-in microphones are great as well. There is an array on either side of the webcam, and I'm assuming it uses those for noise cancellation, etc. I know that on Skype, with the speakers blasting and me talking, the other attendees told me there was absolutely no echo and my voice came through crystal clear. So, check the box for remote team collaboration with this laptop (Skype, GTM, and WebEx are now fully tested).

Backlit Keyboard

The backlit keyboard was another thing I wrote off as a "senseless perk" and never paid much attention to. It's an option on this model, but mine was fully loaded so they threw it in for me and … I like it! In fact, I love it. I didn't realize how often I'd come into the office in the dark, not want to turn on a bright light, and fumble around on the keyboard. My test was rebuilding the machine while watching The Hobbit with my daughter. The lights were off for full movie effect, and the backlight allowed me to tap in commands when needed. I saw one review that complained about the bleed around the keys, but it's really only evident when you are looking at the keyboard from the front (when would you do that?). From a normal position it is perfectly fine. Here's a snapshot of the keyboard with the backlight on:

[Photo: the keyboard backlight in a dark room]

The only forehead-slap is the way you turn it on and off. There is a function key wired to do this (see F5 in the image), but although the wireless key (airplane mode, F12) has its own little indicator for when wireless is on or off, the backlight key has no indicator. If I were to engineer this keyboard, I'd put a little light on the backlight function key so you could find it in the dark to turn on the keyboard. As it is, you have to fumble around or tilt the screen to find it; then, once it's on, the keys are all illuminated.

Performance and Compatibility

So here is where it gets really exciting. The performance of this laptop is outstanding. It features the latest Intel 4th-generation "Haswell" chips, a capable discrete video card (not one for extreme gamers, but fine for a development machine), and a blazing fast SSD. The SSD is the Samsung 840 series and it flies. My "non-standard" benchmarks: publishing a database that populates tens of thousands of sample records went from 3–4 minutes on my Lenovo Yoga to under one minute on this machine. Builds of a 40+ project solution went from 5–7 minutes down to 2–3 minutes. Installing Office (yes, ALL of Office) took about 5 minutes. Visual Studio took longer to download than to install. Bottom line: it performs well.

I had no driver issues with anything I plugged into it – my Targus USB docking station was recognized immediately, and so was the Lenovo ThinkVision mobile monitor I use when on the road. My headphones connect over Bluetooth without any problems.

I’m sure you can look up various benchmarks and other performance specs online but here’s the Windows Experience Index from my laptop to give you an idea of how it does:

[Screenshot: Windows Experience Index scores]

The Intel HD 4600 integrated GPU is what drags the overall score down. You can see the NVidia scores slightly better, and everything else is great. The SSD really makes a difference – it's almost twice as fast as the one in my Yoga. When I go onsite and power on my laptop, it's literally: turn it on, wait a few seconds, swipe my index finger. I'm there and ready to go.

Battery

This is something I haven't tested extensively. My brief test involved switching to the integrated graphics card and then watching full-screen video. I used the Beats Audio speakers for a portion and then switched to headphones. After 3 hours the battery showed about 45 minutes remaining, but the estimates seemed to consistently run low, so I'm guessing I would have had more time than it reported. It's enough to power the laptop through a flight from Atlanta to Seattle, so that works for me.

Conclusion

I had some trepidation when I made the purchase. I had read some negative reviews (though the computer is rated very highly overall on HP's own site and Amazon) and had a bad experience with the Windows Store issue, but once I worked through that and used it, I definitely think it's a keeper. If you don't need the memory and 4th-generation processors, I'd suggest going for a Yoga or Samsung Chronos (and if you can wait, Samsung will likely come out with more powerful configurations), but otherwise this is the best performance I could find that still maintains a good form factor despite the size. The only two things I'd change would be better viewing angles on the display and a Lenovo keyboard, but until Lenovo gives me an option with 4th-generation processors, 16 GB of RAM, and an SSD, I'm sticking with this one. This is definitely the development workhorse I was hoping for. I'll give my ASUS to my daughter, and my Yoga will become my "convenience" laptop that I keep downstairs for social media, quick updates, and watching videos, while I keep this one plugged into my Targus USB 3.0 docking station for a full three monitors.

Wednesday, August 14, 2013

Testable Filters with TypeScript, AngularJS and Jasmine

The T6502 Emulator displays a set of registers to indicate the status of the program counter (where in memory the CPU is looking for the next set of instructions), the values of registers (temporary storage in the CPU), and a special register called the "processor status." The processor status packs a lot of information into a single byte because it contains flags that indicate whether the last operation dealt with a zero value, whether it produced a negative value (based on "two's complement" representation, whereby the high-order bit is set), whether there was a carry from an addition, and so forth. It makes sense to display this register as a series of individual bits to see what's going on.

The initial status is rendered like this:

PC SP A X Y NV-BDIZC Runtime IPS
0x200 0x100 0x0 0x0 0x0 00000000 0.000 0
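
The column header NV-BDIZC spells out the flag order: Negative, oVerflow, an unused bit, Break, Decimal, Interrupt disable, Zero, and Carry. As a quick illustration of what "unrolling" the byte means (this is just a sketch – the names and helper below are hypothetical, not the emulator's actual code), each flag is a simple bit mask against the status byte:

module Examples {
    // Bit masks for the 6502 processor status register (NV-BDIZC).
    export enum StatusFlag {
        Carry = 0x01,
        Zero = 0x02,
        InterruptDisable = 0x04,
        Decimal = 0x08,
        Break = 0x10,
        Overflow = 0x40,
        Negative = 0x80
    }

    // True when the given flag is set in the status byte.
    export function isSet(status: number, flag: StatusFlag): boolean {
        return (status & flag) !== 0;
    }
}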

To see how this is done through AngularJS, take a look at the source code:

<table>
    <tr><th>PC</th><th>SP</th><th>A</th><th>X</th><th>Y</th><th>NV-BDIZC</th><th>Runtime</th><th>IPS</th></tr>
    <tr>
        <td>{{cpu.rPC | hexadecimal}}</td>
        <td>{{cpu.rSP | hexadecimal}}</td>
        <td>{{cpu.rA | hexadecimal}}</td>
        <td>{{cpu.rX | hexadecimal}}</td>
        <td>{{cpu.rY | hexadecimal}}</td>
        <td>{{cpu.rP | eightbits}}</td>
        <td>{{cpu.elapsedMilliseconds / 1000 | number:3}}</td>
        <td>{{cpu.instructionsPerSecond}}</td>
    </tr>
</table>

Notice the special braces for binding. They reference an object (the CPU) and a value. The pipe then passes the contents to a filter – in this case one called "eightbits," because it unrolls the status into its individual bits. The binding happens through scope. Scope is the glue between the model and the UI. Conceptually, scope looks like this:

[Diagram: scope as the glue between the model and the view]

The scope was set to an instance of the CPU in the main controller:

$scope.cpu = cpuService.getCpu();

A powerful feature of angular is the ability to separate the declarative UI logic from the imperative business and presentation logic. In the case of the register, we want to show the individual bits. However, in the CPU model it is truly a byte register. The model of our CPU shouldn’t have to change just because someone wants to see the register in a different way – that is a design decision (it is part of the user interface and how the user experiences the information). Exposing the individual bits on the model is the wrong approach. Instead, we want to create a filter which is designed specifically for this scenario: manipulating output.

For convenience, I created a Main module that exposes a universal "App" class with some static fields. These fields make it easy to access and register modules. Keep in mind this is not necessary – you can access a module from anywhere within an angular app simply by calling angular.module() and passing the name. However, I find that having something like this eases development, for both tests and the production app, and makes the references fast and easy.

module Main {
    export class App {
        public static Filters: ng.IModule = angular.module("app.filters", []);
        public static Directives: ng.IModule = angular.module("app.directives", []);        
        public static Services: ng.IModule = angular.module("app.services", []);
        public static Controllers: ng.IModule = angular.module("app.controllers", ["app.services"]);    
        public static Module: ng.IModule = angular.module("app",
            ["app.filters", "app.directives", "app.services", "app.controllers"]);
    }
}
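
These static references make registration a one-liner anywhere in the code base. As a hypothetical example (this controller isn't one from the emulator), a controller that exposes the CPU on scope could register itself like this:

module Controllers {
    export class CpuController {
        public static $inject = ["$scope", "cpuService"];
        constructor($scope: any, cpuService: any) {
            // Expose the CPU instance so the view can bind to its registers.
            $scope.cpu = cpuService.getCpu();
        }
    }

    // Register through the static module reference rather than
    // looking the module up by name with angular.module().
    Main.App.Controllers.controller("CpuController", CpuController);
}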

Note the "main" module ties in all of the dependent modules, but the filters are in their own module that can run independently of everything else. I'm using definition files from DefinitelyTyped to make it easy to discover and type the functions I'm using within angular from TypeScript. Now the filter can be defined. First, I want to define a spec for the filter that describes how it will behave. To do this, I integrated Jasmine with the solution. Jasmine is self-described as "a behavior-driven development framework for testing JavaScript code." It is also easy to use TypeScript to generate Jasmine tests.

The test harness simply includes angular, Jasmine, the main "app" that provides the statically registered modules, and then whatever individual pieces of the app I wish to test. For the filter, I decided to call it "eightbits." Note that I was really only concerned with three cases – this is by no means an exhaustive test of every possible permutation. I want to be sure that invalid input is simply echoed back, that valid input is displayed as bits, and that a small number is appropriately padded with zeroes so I get 00000001 instead of 1. The eightbits spec looks like this (all of the source is available via the T6502 CodePlex site):

module Tests {

    describe("eightbits filter", () => {
  
        var filter: any;

        beforeEach(() => {    
            module('app');          
        });

        beforeEach(() => {
            inject(($filter) => {
                filter = $filter('eightbits');
            });
        });

        describe("given invalid input when called", () => {
            it("then should return the value back", () => {        
                expect(filter('zoo')).toEqual('zoo');                
            });
        });

        describe("given valid input when called", () => {
          
            it("then should return the bits for the number", () => {
                expect(filter(0xff)).toEqual('11111111');                
            });
            
            it("with smaller number then should pad bits to 8 places", () => {
                expect(filter(0x01)).toEqual('00000001');                
            });          
        });
    });
}

Let's break down what happened. First, I described the test suite (eightbits filter). I provided a variable to hold the instance of the filter. Before each test, I run the module alias. This is provided by angular-mocks.js and enables us to stand up modules for testing. Next, I use the inject method to handle dependencies. $filter is a specific angular service. By passing it as a parameter to the injection method, angular will look at any dependencies wired up so far and provide the service. This service allows me to ask for a filter by name, so once the filter is registered, the injector will pick it up and provide it to me.

Now that I have an instance of the filter, the test conditions play through. When the filter is passed zoo, we want zoo to bounce right back. When it is passed a byte with all bits set, we want that reflected in the result, and when I pass a value with a single bit set, I check for the padding. Of course, we haven't built the filter yet, so all of these tests fail (but you may find it interesting that this compiles, since I'm referencing the filter via the filter service and not as a direct reference).

I can now write the filter – note that the filter registers itself with angular using the same name I used in the test.

module Filters {

    export class EightBitsFilter {
       
        public static Factory() {
            return function(input): string {
           
                var padding: string = "00000000";
                
                if (angular.isNumber(input)) {
                    var result = padding + input.toString(2);
                    return result.substring(result.length - 8, result.length);
                }

                return input;
            }
        }
    }

    Main.App.Filters.filter("eightbits", [EightBitsFilter.Factory]);
}

I am using a factory pattern that provides a way for angular to create an instance of the filter. Angular will call this, keep track of the instance, and inject it anywhere it is used. After the definition of the filter, I get a reference to the angular module for filters and call the filter method. This is passed the name of the filter and its dependencies, which in this case is just the factory to create it. The filter function takes in the bound value and returns a string. I check whether the input is a number (otherwise I just return the value, as in the "zoo" case), then I convert it to a binary string and pad it as necessary.

Here I’ve been able to test a filter that is used in the UI. I was able to provide a spec for it that describes all of my expectations, so anyone looking at the test knows how it is supposed to behave. And, by registering the filter, I no longer have to worry about how bits are exposed by my model. Instead, I simply bind them to the scope and pass them through the filter to appear in the UI.

If you're not convinced how powerful this feature is, imagine an enterprise customer that requires you to format dates in a special format. They are still arguing over the exact date format, so instead of waiting or risking refactoring, you build a filter. For now, it simply spits out the date. Down the road, you get the details for how to display it, but the date must be formatted differently in various areas of the application. Is that a problem? No! You can easily create a service that represents where the user is, then allow the filter to query the service and format the date accordingly. You make a change in one, possibly two places, and roll out the change, rather than having to search every place in the application you used the date to make an update. Starting to see the value in this?
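
A minimal sketch of that idea (the names here – the "context" service, the area check, the formats – are hypothetical placeholders, not a real API):

module Filters {
    export class EnterpriseDateFilter {
        // The factory receives the injected context service, which is
        // assumed to know which area of the app the user is in.
        public static Factory(context: any) {
            return function (input: any): string {
                if (!angular.isDate(input)) {
                    return input; // echo non-dates back, like the "zoo" case
                }
                return context.area === "reports"
                    ? input.toISOString()           // e.g. 2013-08-14T12:00:00.000Z
                    : input.toLocaleDateString();   // e.g. 8/14/2013
            };
        }
    }

    Main.App.Filters.filter("enterpriseDate", ["context", EnterpriseDateFilter.Factory]);
}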

Many posts cover services, filters, and directives in isolation, so in my next angular-related post I’ll share how to use a service in combination with a directive to create a testable UI artifact that your code can interact with independent of how the UI is actually implemented.

Monday, August 12, 2013

Handling Windows 8 Orientations in Windows 8.1

Windows 8.1 eliminates the concept of a “snapped” or “filled” view and allows apps to run at a variety of sizes. The minimum default size is set to 500 pixels wide, but this can be overridden for legacy apps or apps designed specifically for the narrower resolution. The changes can make migration difficult, however. If you built your app using the built-in LayoutAwarePage class, you’ll find the functionality has changed significantly and there is no automatic trigger for orientation changes that map to visual states.

Ultimately, you will end up migrating your apps to the new system. Realistically, you may have dozens of screens that are expecting to adapt to the screen orientation based on legacy code, and it would be nice to reuse all of the effort that went into these styles. Should you just chuck the code and start from scratch? There may be a way to salvage those old visual states.

I was recently working on converting some of my sample projects from Windows 8 to Windows 8.1 and came across a project that demonstrates the Visual State Manager (VSM). It used a combination of some styled controls and the actual page orientation to do this. First, a quick review of the Windows 8 screen orientations as defined in the basic template:

Default or Full Screen Landscape

This is your app, running full screen, in a typical configuration assuming the tablet or laptop is in a typical landscape orientation.

[Screenshot: full-screen landscape view]

Filled

This is the landscape orientation when your screen is next to another app that is in the snapped (small) mode.

[Screenshot: filled view beside a snapped app]

Portrait

This happens when you turn the tablet so it is taller than it is wide, like a sheet of paper. It’s more of a “book” or reading orientation.

[Screenshot: portrait view]

Snapped

Finally, you can “snap” your app which places it to the side with limited functionality to run side-by-side with another app.

[Screenshot: snapped view]

In Windows 8.1, the concepts of “filled” and “snapped” go away. You can have multiple apps running side-by-side with a minimum (default) resolution of 500 pixels. You are either in landscape or portrait mode, and that’s it. This, of course, doesn’t help us migrate our existing apps that were relying on the other modes.

When working on this particular app, I decided to think about the modes as still existing in Windows 8.1, with new definitions, like this:

  • FullScreenLandscape – landscape and full screen, of course
  • Filled – landscape and not full screen
  • FullScreenPortrait – technically, any type of portrait orientation
  • Snapped – the minimum width (500 pixels)

With this set of definitions in mind, I decided the easiest way to migrate existing pages would be to build something that replicates what the old template used to do. The main thing it did was listen for changes to the layout and update the orientation by calling the visual state manager. It turns out we can do that in Windows 8.1 and do it in a way that doesn’t require us to inherit from a base page or even duplicate code on multiple pages. Instead, we can create an attached property and attach it to the pages we want to be “layout aware.”

The attached property is defined on a static class called OrientationHandler. The HandleOrientation property determines whether or not the legacy behavior should be applied. Set it to true to use it (I can’t imagine why you’d attach it at all if you’re going to set it to false).

public static readonly DependencyProperty HandleOrientationProperty =
    DependencyProperty.RegisterAttached(
        "HandleOrientation",
        typeof(bool),
        typeof(OrientationHandler),
        new PropertyMetadata(false, OnHandleOrientationChanged));

public static void SetHandleOrientation(UIElement element, bool value)
{
    element.SetValue(HandleOrientationProperty, value);
}

public static bool GetHandleOrientation(UIElement element)
{
    return (bool)element.GetValue(HandleOrientationProperty);
}

Next, a property is used to keep track of the previous state so that it doesn’t keep transitioning when a size change doesn’t result in a new orientation.

public static readonly DependencyProperty LastOrientationProperty =
    DependencyProperty.RegisterAttached(
        "LastOrientation",
        typeof(string),
        typeof(OrientationHandler),
        new PropertyMetadata(string.Empty));

public static void SetLastOrientation(UIElement element, string value)
{
    element.SetValue(LastOrientationProperty, value);
}

public static string GetLastOrientation(UIElement element)
{
    return (string)element.GetValue(LastOrientationProperty);
}

When the property is attached to a control and set to true, it hooks into several events to evaluate the orientation.

private static void OnHandleOrientationChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
{
    var control = d as Control;

    if (control == null)
    {
        return;
    }

    control.Loaded += (sender, args) => SetLayout(control);
    control.LayoutUpdated += (sender, args) => SetLayout(control);
    control.SizeChanged += (sender, args) => SetLayout(control);
}

Finally, a method applies the layout algorithm I described above to determine the orientation and, if it is a new orientation, transitions to the appropriate visual state.

private static void SetLayout(Control control)
{
    var orientation = ApplicationView.GetForCurrentView().Orientation;
    string newMode;

    if (orientation == ApplicationViewOrientation.Landscape)
    {
        newMode = ApplicationView.GetForCurrentView().IsFullScreen ? "FullScreenLandscape" : "Filled";
    }
    else
    {
        newMode = Window.Current.Bounds.Width <= 500 ? "Snapped" : "FullScreenPortrait";
    }

    if (newMode == GetLastOrientation(control))
    {
        return;
    }

    VisualStateManager.GoToState(control, newMode, true);
    SetLastOrientation(control, newMode);
}

That’s it – now I can take the existing XAML along with the legacy visual states and plug it into Windows 8.1 by attaching the property like this (see the last line):

<common:Page
    x:Name="pageRoot"
    x:Class="VisualStateExample.MainPage"
    DataContext="{Binding DefaultViewModel, RelativeSource={RelativeSource Self}}"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:local="using:VisualStateExample"
    xmlns:common="using:VisualStateExample.Common"
    xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
    xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
    mc:Ignorable="d"
    local:OrientationHandler.HandleOrientation="True">

You can download both the Windows 8 and Windows 8.1 versions of this code by referencing the source code at this link – the code is from the VisualStateExample project in Chapter 3.

Friday, August 9, 2013

Commodore 64 Meets AngularJS and TypeScript

In case you haven’t noticed, I’ve been spending a lot of time working with a combination of technologies that I believe make it easier and more efficient for large development teams to build enterprise web applications. TypeScript gives JavaScript some discipline and AngularJS brings XAML-like declarations to HTML. Although I’ve been using this power-packed duo in a large system for months now, it is a proprietary code base so I can’t use a lot of what I use in my day-to-day activities as examples for this blog. Small, targeted examples like the ones I share in my WintellectNOW video Fundamentals of AngularJS are helpful to understand the system but at some point you need a larger example to see how everything ties together.

I decided that if I was going to spend what little free time is left over – between consulting full time, writing a book, authoring videos, training, and letting my wife and daughter catch a glimpse of me on occasion – writing an example application, I might as well make it fun. I've always wanted to write a Commodore 64 emulator and this was the perfect excuse. To be fair, this is more of a 6502 instruction set simulator; there is no hardware emulation (it also doesn't handle cycle timing), but it was a fun project nonetheless and demonstrates how these technologies can be used.

The T6502 emulator (hint: that link will get you to a working version) is still very much a work in progress. You can download the latest source here. I have some sample programs that will load and run, including a few graphics routines as well as "test programs" that verify the accuracy of the emulation. The packed decimal test fails right now, but I don't suspect I'll waste too much time tracking that down, as it is such a rarely used mode. I also don't have all of the op codes in place (I've taken the approach of finding a program I want to run, then building out the op codes that support it).

I’m going to use this reference software to write a series about building apps with TypeScript and AngularJS. In this introductory session, I want to share why I believe this is the right set of tools for building enterprise web apps. First, let’s discuss the “what” …

What is It?

The software emulates the 6502 instruction set. It features: a compiler that recognizes basic assembly, with support for setting the memory location and assigning/using labels and literals in both decimal and hexadecimal format; a disassembler; a memory dump feature; a debugger (yes, you can step through op codes line by line); a dashboard that displays the current program counter and register values; a console for diagnostic information and output; and a 32 x 32 display that uses SVG to emulate a memory-mapped display with a 256-color palette (I realize this is not how the Commodore worked, but hey, this is 2013).

The assembler supports the use of DCB to load bytes (i.e. DCB 0x02, 0x03, 0x04 will load those values into subsequent memory locations). The video memory is mapped starting at $FC00 and runs left to right, top to bottom. There is a zero page address for random numbers, and I plan to add character input, character output (to the console), and a clock for some more interesting programs. The default program start is $200, just above the stack. The reason I mapped video to $FC00 is that you can BNE off the high byte to know when you're off the edge of the display, so to speak.
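
To make that mapping concrete, here is a sketch of the address math (just an illustration of the layout described above, not the emulator's actual code):

// Translate an address in video memory ($FC00 and up) into
// x/y coordinates on the 32 x 32 display.
var VIDEO_BASE: number = 0xFC00;
var DISPLAY_WIDTH: number = 32;

function toPixel(address: number): { x: number; y: number } {
    var offset = address - VIDEO_BASE;          // 0..1023 for the 32 x 32 grid
    return {
        x: offset % DISPLAY_WIDTH,              // left to right
        y: Math.floor(offset / DISPLAY_WIDTH)   // top to bottom
    };
}

// toPixel(0xFC21) => { x: 1, y: 1 } – second column of the second row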

How is it Done?

This is a bit more interesting. I took a Test-Driven Development (TDD) approach, although I did write some components before their tests. The majority of op codes and all of the compiler features were written this way. I used Jasmine for the client-side tests (just click on the specs links). For the compiler, I might write a test for a new feature I wanted to implement like this:

describe("given compiler when decompiler called with code", () => {

    beforeEach(() => {
        var jmpAbsolute = new Emulator.JmpAbsolute();
        var address = cpu.rPC;

        // $0200: JMP $0200
        cpu.poke(cpu.rPC, jmpAbsolute.opCode);
        cpu.poke(cpu.rPC + 1, address & Constants.Memory.ByteMask);
        cpu.poke(cpu.rPC + 2, address >> Constants.Memory.BitsInByte);
    });

    it("then should return the decompiled code", () => {        
        var actual = compiler.decompile(cpu.rPC);
        expect(actual).toContain("$" + toHexAddress(cpu.rPC) + ": JMP $" + toHexAddress(cpu.rPC));
    });
});

This would cause the test to fail. Then, I’d update the compiler to implement the feature:

public decompile(startAddress: number): string {
            
    var address: number = startAddress & Constants.Memory.Max;
    var instructions: number = 0;
    var lines: string[] = [];

    while (instructions < Constants.Memory.MaxInstructionsDecompile &&
        address <= Constants.Memory.Max) {
            
        var opCode: number = this.cpu.peek(address);

        var parms = [
            opCode,
            this.cpu.peek(address + 1), 
            this.cpu.peek(address + 2)
        ];

        var operation = this.cpu.getOperation(opCode);

        if (!operation) {
            operation = new InvalidOp(opCode);    
        }

        lines.push(operation.decompile(address, parms));

        instructions += 1;
        address += operation.sizeBytes;
    }

    return lines.join("\r\n");
}

And run the test. This is also how I did the op codes. There are hundreds of tests and I’ll add more as I evolve the instruction set. I’m using some of the cool TypeScript features as well, including generics:

$http.get(url).then((result: ng.IHttpPromiseCallbackArg<string>) => {
    this.$scope.compilerInfo = result.data;
});

Interfaces to describe what I’m interacting with (so the CPU just uses the interface and doesn’t care about the implementation):

export interface IOperation {
    execute(cpu: Emulator.ICpu);
    addressingMode: number;
    opCode: number;
    opName: string;
    sizeBytes: number;
    decompile(address: number, bytes: number[]): string;
}

Inheritance to avoid repeating code:

export class BaseOpCode implements IOperation {
        
    constructor(
        public opName: string,
        public sizeBytes: number,
        public addressingMode: number,
        public opCode: number) {
    }

    public decompile (address: number, bytes: number[]): string {
        return OpCodes.ProcessLine(address, this, bytes);                
    }

    public execute(cpu: Emulator.ICpu) {
        return;
    }
}

And classes to organize op codes into testable units:

export class AddWithCarryZeroPageX extends BaseOpCode {
    constructor() {
        super("ADC", 0x02, OpCodes.ModeZeroPageX, 0x75);
    }
    public execute(cpu: Emulator.ICpu) {
        OpCodes.AddWithCarry(cpu, cpu.peek(cpu.addrZeroPageX()));
    }
}

When you look at the main page, you'll see AngularJS at play. For example, the display of the registers uses a hexadecimal filter to show the hex values and a bits filter to show the individual bits in the processor status byte (oh, design people, don't hate me for resorting to tables):

<tr>
    <td>{{cpu.rPC | hexadecimal}}</td>
    <td>{{cpu.rSP | hexadecimal}}</td>
    <td>{{cpu.rA | hexadecimal}}</td>
    <td>{{cpu.rX | hexadecimal}}</td>
    <td>{{cpu.rY | hexadecimal}}</td>
    <td>{{cpu.rP | eightbits}}</td>
    <td>{{cpu.elapsedMilliseconds / 1000 | number:3}}</td>
    <td>{{cpu.instructionsPerSecond}}</td>
</tr>

And then both the console and the display use Angular directives:

<div class="column"><console></console></div>
<div class="column">
    <div style="width: 200px;">&nbsp;</div>
    <display></display>
</div>
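
Directives deserve (and will get) their own post; as a teaser, here is a simplified sketch of what an element directive like <display> might look like (the template and link body are placeholders, not the emulator's real implementation):

module Directives {
    export class DisplayDirective {
        public static Factory(): ng.IDirective {
            return {
                restrict: "E",  // matches the <display> element
                template: "<svg class='display'></svg>",
                link: (scope, element) => {
                    // The real directive would wire the SVG up to the
                    // memory-mapped video addresses here.
                }
            };
        }
    }

    Main.App.Directives.directive("display", [DisplayDirective.Factory]);
}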

I’ll continue to post to this series and describe how I built the compiler and step through TypeScript and AngularJS as a part of that. For now I wanted to share where it’s at and what it is.

What It’s Not

There is a large list of things "to do," but I will eventually get to them, as this is a fun hobby for me. I've always wanted to write an emulator, and I find it ironic that I've finally tackled it using web technologies like SVG and JavaScript. There are tons of op codes not implemented yet – I found sample code and built up the op codes to support it, and am just now walking through the entire instruction set, so that will take some time unless someone joins the project and helps extend them. I haven't implemented the zero page magic yet, like recognizing a key press or providing a timer. These are just repeatable extensions to the base that's there, however, and the main structure, processor, logic, etc. are all in place.

What’s Next?

I’m looking to wrap up as much of this as I can by DevLink to use in my talk about AngularJS and JavaScript, but I plan to release a series of blog posts now that I have the example application finished. You can find tons of how-to for TypeScript and AngularJS on the web, so for this I’m going to tie it directly to the project – for example, how did I solve the problem of having a console that components can write to without depending on the directive that renders it? How did I create a testable filter to show hexadecimal? How did I implement the pixel-based display? Why use an interface? What do generics in TypeScript solve for me? I hope these practical, hands-on examples will help you with your projects.

That's it for now – I'm off to work on my Windows 8.1 book, but I'll be back with my first post "in the details" soon.