Author: JK

Josh is a software developer who lives in Littleton, CO.

HTTP/2 is Awesome

At the West Side Web Shredders’ most recent Meetup, we went over the features and benefits of HTTP/2. HTTP/2 brings a lot of exciting features to the web to keep up with modern demands.

I encourage you to go through the deck of information we put together on our GitHub site.

Using Google Analytics with your SPA Angular 2 Web App

Tracking navigation in a Single Page Application takes a little more work up front. This article explains how to use Google Analytics with an Angular 2 web app.

First you’ll need to register your web domain with Google. There are a few different ways to verify your site. Once verified, you’ll be able to download a JavaScript snippet to add to your site. With a traditional (non-SPA) site, you just drop this into your layout template and you’d be good to go.

The technical details of tracking SPA navigation are covered in this guide from Google. The key is to grab the tracking ID from the create call in the snippet. We’ll use that later in the implementation.

ga('create', 'UA-69737250-1', 'auto');

First add this to your index.html. This will be the only modification to that page.

<script async src='https://www.google-analytics.com/analytics.js'></script>

Next, modify your package.json to include the autotrack library from Google. 1.0.3 was the latest version available when this post was written.

"autotrack": "~1.0.3",

Next we’re going to create a service. This will be a global service instantiated when our app first loads.

import {Injectable} from '@angular/core';

// 'require' assumes a CommonJS-aware build (e.g. Webpack); add typings or a declare if your compiler complains
require('autotrack');

@Injectable()
export class AnalyticsService {
    constructor() {
        // Standard analytics.js bootstrap snippet: queue ga() calls until the script loads
        (window as any).ga = (window as any).ga || function () {
            (((window as any).ga.q = (window as any).ga.q || [])).push(arguments);
        };
        (window as any).ga.l = +new Date();

        (window as any).ga('create', 'UA-69737250-1', 'auto');
        (window as any).ga('require', 'cleanUrlTracker');
        (window as any).ga('require', 'eventTracker');
        (window as any).ga('require', 'outboundLinkTracker');
        (window as any).ga('require', 'urlChangeTracker');
    }

    pageView(url: string) {
        (window as any).ga('set', 'page', url);
        (window as any).ga('send', 'pageview');
    }

    // To track clicks from HTML attributes, use autotrack's eventTracker plugin
    // with markup such as ga-on="click"
}

If you have an app.module.ts, or a similar file where you’re bootstrapping your application, add the service to the providers: [] array there.
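
Here’s a minimal sketch of that registration, assuming an NgModule-based setup; the module and component names are placeholders and your import paths will differ.

import {NgModule} from '@angular/core';
import {BrowserModule} from '@angular/platform-browser';

import {AppComponent} from './app.component';                   // placeholder root component
import {AnalyticsService} from './services/analytics.service';  // path will likely differ

@NgModule({
    imports: [BrowserModule],
    declarations: [AppComponent],
    // Registering the service here creates a single app-wide instance,
    // so the analytics bootstrap in its constructor runs once at startup.
    providers: [AnalyticsService],
    bootstrap: [AppComponent]
})
export class AppModule {}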

From the component where you want to track navigation, import the service and inject it into your component class. Then, in ngOnInit, log the page navigation. (This is not a full component code sample, just the pieces you need for it to work.)

import {AnalyticsService} from './../services/analytics.service'; //path will likely differ

  constructor(private _analytics: AnalyticsService) {
  }

  ngOnInit() {
    this._analytics.pageView('/login.html');
  }

That’s all there is to it. If you want to see whether it’s working, you can watch the network traffic in the Chrome Developer Tools, or of course log into your Analytics account to see if the traffic is being captured.

My Experience with Angular 2

I’ve been working with Angular 2 since the early betas, and now into the release candidates. Working with a new platform (yes, Angular 2 is a platform rather than a framework) during beta releases is a bit painful, and working through the release candidates (1-4 so far) has been trying as well. You have to know what you’re signing up for, be patient, and be willing to learn new things quickly.

To be the most productive when developing in Angular 2, you should be developing with TypeScript. If you’ve done ES6 (ES2015) recently, then TypeScript will feel right at home. The TypeScript learning curve is pretty small, so don’t be intimidated by it. Trust me, it’s worth learning: Angular 2 is written in TypeScript, and most of the examples you’ll see on blogs and Stack Overflow are in TypeScript.

So how does it stack up against Angular 1? For starters, it’s much simpler to learn. There are some 50 fewer built-in directives in Angular 2. There are no factories, just services (which are now called providers). Directives and controllers have essentially merged into components. Components have a selector attribute, which is the tag name the component represents (such as <registration>). Directives are now just for element attributes (<button sound-alert-on-click>), with sound-alert-on-click being the directive.
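
As a rough sketch of that distinction (the registration and sound-alert-on-click names are purely illustrative):

import {Component, Directive, HostListener} from '@angular/core';

// A component owns a tag name via its selector and brings its own template.
@Component({
    selector: 'registration',
    template: '<h2>Register here</h2>'
})
export class RegistrationComponent {}

// A directive attaches behavior to an existing element through an attribute selector.
@Directive({selector: '[sound-alert-on-click]'})
export class SoundAlertOnClickDirective {
    @HostListener('click')
    onClick() {
        // Placeholder behavior; a real implementation might play a sound here.
        console.log('click!');
    }
}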

Routing is pretty straightforward, but the router is still in beta while the Angular 2 platform is currently in release candidate 4 (July 10th 2016). The new router isn’t hard to learn at all.

Forms and validation still seem to be in flux too. They were changed a little bit in release candidate 3, so hopefully they will stabilize now.

Binding and event handling are different, but not hard to pick up if you’ve done Angular 1.x development before. The main difference is that you can bind to any DOM event and any standard HTML element attribute without using an Angular directive like ngModel. So, for example, if you want to bind to the click event of a button, you write <button (click)="submit()">Submit</button>, where submit() is simply a method in the class that represents your component.
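
Here is a minimal component showing that binding; the selector and method names are made up for illustration.

import {Component} from '@angular/core';

@Component({
    selector: 'login-form',
    // (click) binds straight to the DOM click event; no ng-click style directive needed
    template: '<button (click)="submit()">Submit</button>'
})
export class LoginFormComponent {
    submit() {
        // submit() is just a method on the component class
        console.log('submitted');
    }
}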

Binding is isolated more in Angular 2 through the use of zones, made possible in part by zone.js. zone.js is what Angular 2 relies on for change detection, and without zone.js in your project, binding won’t work.

What’s the hardest thing to learn in Angular 2? I would have to say Observables. Angular 2 uses the RxJS library in place of promises. What’s the difference between observables and promises? I suggest watching this video from ng-conf; it’s the best explanation I’ve seen yet. You can get going without really knowing a lot about RxJS, but I suggest digging into it. I’ve already used it for a global event provider that was extremely useful in my application.
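
As a rough sketch of the kind of global event provider I mean (assuming RxJS 5 style imports; the EventService name and event shape are just placeholders):

import {Injectable} from '@angular/core';
import {Subject} from 'rxjs/Subject';
import {Observable} from 'rxjs/Observable';

// Illustrative shape for an app-wide event
export interface AppEvent {
    name: string;
    payload?: any;
}

@Injectable()
export class EventService {
    private events = new Subject<AppEvent>();

    // Components subscribe to this stream instead of wiring up callbacks everywhere
    get events$(): Observable<AppEvent> {
        return this.events.asObservable();
    }

    emit(name: string, payload?: any) {
        this.events.next({name, payload});
    }
}

// Usage from a component that injected EventService:
// this._events.events$.subscribe(e => { if (e.name === 'logout') { /* react here */ } });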

One last learning curve you should be aware of is the command-line tooling used to build your Angular 2 project. ES5 should not be used when writing an Angular 2 application (even though the build process spits out ES5). So if you’re using ES6, you’ll be using Babel, and if you’re using TypeScript, you’ll need to transpile that to ES5 as well. I’ve been using Webpack along with the Angular 2 Webpack starter. Webpack is a little overwhelming at first, but once you get what it’s doing, you realize how powerful it is and how much you love it 😍.

You can also go with the Angular CLI project, which is a lot simpler and more straightforward, but not nearly as powerful as Webpack.

NativeScript recently added support for Angular 2, so you can now use Angular 2 to build native iOS and Android apps. I’ve also heard about a little hack that will allow NativeScript to build an app for Apple TV, so you can technically use Angular 2 to build Apple TV apps as well.

The Angular 2 ecosystem will only continue to evolve, which is why I said that Angular 2 is a platform earlier. The future is bright for Angular 2 and your future will be bright as well if you hop onboard.

Quantum Computing is Coming

The term Quantum Computing sounds pretty intimidating, powerful, and mysterious. What is Quantum Computing? A very basic explanation is that it’s a new way of processing information. Existing computer applications and algorithms will not work on a Quantum Computer.

Some of the biggest technology companies in the world have research facilities devoted to this technology, and it’s not only private, profit-based businesses that are interested. The Chinese government has effectively unlimited resources devoted to Quantum Computing research. Besides Cold Fusion, it’s one of the most game-changing fields in science and mathematics. The winner, or the first to exploit the power of Quantum Computing, will be in one of the most powerful positions in the world.

How is a Quantum Computer different from our current computers? It’s based on entanglement which is one of the strangest things we have discovered in nature. Basically, two different particles share the same information when they become entangled. So far, we haven’t found any distance that will not allow entanglement to happen. The information sharing also seems to happen instantly, faster than the speed of light.

Microsoft, Google, IBM, and other large tech companies are all investing a lot of money in Quantum Computing research. The people they are employing are mostly from academia: mathematicians, physicists, Fields Medal winners. Some of the smartest people in the world are working on bringing this technology to the masses. It’s not only the creation of the technology but also the algorithms that will run on these machines. Initially, only a few people in the world will have the knowledge to be able to write the “code” that runs on a Quantum powered machine.

What’s the difference between a Dell computer sitting in your house and a Quantum Computer? A major one is that a Quantum Computer needs to be cooled to near absolute zero to operate. It’s not feasible to think that a Quantum Computer will ever sit in somebody’s house, or be taken to the coffee shop for browsing the Internet. The power will be used for solving extremely complex problems, quickly. An example would be analyzing different molecular combinations for a drug and accurately predicting the outcome, or processing weather and other geological simulations on a scale not possible today. It won’t be used to run your Angry Birds apps.

Some have compared Quantum Computing to the industrial revolution. Nobody really understood the full impact of it while it was happening and it changed the world in a radical way. Quantum Computing won’t be affecting your daily life anytime soon. At least not in the next 5-10 years. But the path seems to be in the right direction where we can make another major leap in productivity and knowledge.

Wikipedia on Quantum Computing

Podcast on the current state of Quantum Computing affairs

http://www.livescience.com/52811-spooky-action-is-real.html

http://www.businessinsider.com/quantum-computers-will-change-the-world-2015-4

Dynamic ng-controller with Angular

If you’re dynamically pulling in views through frameworks like jQuery, or anything else that isn’t Angular, Angular won’t know you are referencing a controller on the page without re-compiling. Letting Angular know about your new HTML is pretty straightforward. I have a method in my Main controller that I can access whenever I need to.

    $scope.activateView = function (elements) {
        $compile(elements.contents())($scope);
        $scope.$apply();
    };

I then have a common function I set up that I can access anywhere I need it in the application.

    function activateAngularController(controllerWrapperId) {
        var wrapperElement = angular.element(document.getElementById(controllerWrapperId));
        // MainController is declared on the <body> tag, so grab its scope from the body element
        var mController = angular.element(document.body);
        mController.scope().activateView(wrapperElement);
    }

My Main controller is defined on the body tag, hence grabbing the body element to populate the mController variable.

<body ng-app="coolApp" ng-controller="MainController">

And finally when you’re instantiating your HTML dynamically, just have it wrapped in a div you can reference.

<div id="myControllerWrapper">
 <div ng-controller="MyController as my">
     <!-- Your html and angular stuff -->
 </div>
</div>

<script>
activateAngularController('myControllerWrapper');
</script>

And that should do it. Angular now knows about the new HTML and directives within your controller wrapper. Alright Peter man, ouu ouu!

Drawing on a UIView through gestures

My latest project on GitHub shows how to draw on the canvas with interaction from different touch gestures.  The project can be found here:  https://github.com/BeanoKing/UIBezierPath-Rotation

All that needs to be done on the view in order to interact with gestures is to add the UIGestureRecognizerDelegate to your header file.

#import <UIKit/UIKit.h>

@interface CustomView : UIView <UIGestureRecognizerDelegate>

@end

Now in your implementation you have access to these events:

- (void) touchesBegan:(NSSet *) touches withEvent:(UIEvent *) event
{
    UITouch *touch = [touches anyObject];
    CGPoint p = [touch locationInView:self];
    
    [self setNeedsDisplay];
}

- (void) touchesMoved:(NSSet *) touches withEvent:(UIEvent *) event
{
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    
    [self setNeedsDisplay];
}

- (void) touchesEnded:(NSSet *) touches withEvent:(UIEvent *) event
{
    [self setNeedsDisplay];
}

Notice that in touchesBegan and touchesMoved I’m capturing the point on the screen. I’ve left out the implementation of what to do with those points, which you can get from the project on GitHub.

A key point when doing custom drawing is to call [self setNeedsDisplay] whenever you want to redraw the UI. This triggers the drawRect: method, where all of your drawing code needs to go. The entire view is redrawn when this is called, so anything you want to stay on screen needs to be drawn here.

- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSaveGState(context);

    // Build a lighter blue fill from the components of blueColor
    const CGFloat *components = CGColorGetComponents([UIColor blueColor].CGColor);
    [[UIColor colorWithRed:components[0] green:components[1] blue:components[2] alpha:0.7] setFill];

    [[UIColor blueColor] setStroke];

    // rectangle and rectPath are instance variables; see the GitHub project
    rectangle = CGRectMake(200, 300, 300, 300);
    rectPath = [UIBezierPath bezierPathWithRect:rectangle];

    [rectPath setLineWidth:2.0f];
    [rectPath stroke];
    [rectPath fill];

    CGContextRestoreGState(context);
}

This code draws a big blue rectangle with dimensions of 300×300, with a lighter blue fill and a blue outline. It’s placed at x coordinate 200 and y coordinate 300; (0, 0) is the upper-left corner of the UIView canvas.

A lot of the implementation and logic can be found in the GitHub project. This was just a high level overview of how to draw on a UIView and capture gestures. If you have any questions on the project leave a comment and I’ll get back to you.

Dynamic form validation with ng-form

Validation with ng-form isn’t difficult, once you get the hang of it. But figuring out some of the advanced features isn’t that trivial. One pretty common scenario a web application needs to do is dynamic form validation. Dynamic meaning the HTML isn’t pre-defined, but is dependent on a condition, which is one of the things that make Angular so great in the first place!

Let’s say you have an ng-repeat and you include an ng-form within that loop. The ng-form needs a unique name.

Example:

<div ng-repeat="arg in inputArray">
    <ng-form name="form{{arg.Name}}">
        <div ng-show="form{{arg.Name}}.$valid == false">
            {{arg.validationMessage}}
        </div>
    </ng-form>
</div>

The name is unique within the inputArray object collection. The validationMessage property is defined on the arg object, and when the validation isn’t met, the message will display.

For example, when you’re iterating through the inputArray you can place an ng-required on an input tag.

<input ng-required="arg.required()" />

Having required as a function works great because you can put conditional logic inside it. If there’s no conditional logic, you can simply return true.
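
Here’s a rough sketch of what the controller side could look like; the module, controller, and flag names are placeholders, and the conditions are whatever your app needs.

angular.module('coolApp').controller('InputController', function ($scope) {
    $scope.notificationsEnabled = true; // placeholder flag driving the conditional rule

    $scope.inputArray = [
        {
            Name: 'Email',
            validationMessage: 'Email is required when notifications are on.',
            // required() can hold whatever conditional logic you need
            required: function () {
                return $scope.notificationsEnabled;
            }
        },
        {
            Name: 'Nickname',
            validationMessage: 'Nickname is required.',
            // no condition here, so simply return true
            required: function () {
                return true;
            }
        }
    ];
});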

Angular form validation works really well once you understand how it works. Give it some time and you’ll really like working with it.

Capturing Text from Dictation in iOS

Users of your app can tap the little microphone key on the left side of the keyboard to speak what they would normally type in. It’s a handy feature for users, but difficult to track in your application. While the UITextView or UITextField text property gets populated, dictation doesn’t call the standard delegate methods such as shouldChangeCharactersInRange.

How to capture the text from a UITextField:

Add this to your awakeFromNib or viewDidLoad method

[self.textField addTarget:self
                   action:@selector(eventEditingChanged:)
         forControlEvents:UIControlEventEditingChanged];

The implementation of the eventEditingChanged method

-(void)eventEditingChanged:(UITextField *)sender {
    if (sender.text.length <= 0) {
        return;
    }

    UITextInputMode *inputMode = sender.textInputMode;
    NSString *modeIdentifier = [inputMode respondsToSelector:@selector(identifier)] ? (NSString *)[inputMode performSelector:@selector(identifier)] : nil;

    if ([modeIdentifier isEqualToString:@"dictation"]) {
        //do something with sender.text
    }
}

How to capture the text from a UITextView:

Add this to your awakeFromNib or viewDidLoad method

[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(modeChange:) name:UITextInputCurrentInputModeDidChangeNotification object:nil];

The implementation of the modeChange method

-(void)modeChange:(NSNotification *)notification
{
    NSString *inputMethod = self.textView.textInputMode.primaryLanguage;

    if (inputMethod != nil) {
        if (![inputMethod isEqualToString:@"dictation"]) {
            //do something with self.textView.text
        }
    }
}

And finally remove the observer

- (void)dealloc
{
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

Now your users won’t be stifled by their laziness of not typing. Enjoy capturing that dictated text.

Implementing Base 36 in your iOS App

Base 36 can be a great way to represent a number with up to 8 alphanumeric characters. If you have an auto-number and don’t want to represent it with just digits, you can convert it to Base 36 to make it look a little sexier. Wikipedia has a great overview of Base 36.

To convert a number to Base 36, you can use this C-based algorithm

static char *base36enc(long unsigned int value)
{
	char base36[37] = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ";
	/* log(2**64) / log(36) = 12.38 => max 13 char + '\0' */
	char buffer[14];
	unsigned int offset = sizeof(buffer);
 
	buffer[--offset] = '\0';
	do {
		buffer[--offset] = base36[value % 36];
	} while (value /= 36);
 
	return strdup(&buffer[offset]);
}

Call this from Objective-C and get the string value

char *wo = base36enc(someint); // someint is your unsigned long value
NSString *stringValue = [NSString stringWithUTF8String:wo];
free(wo); // base36enc returns memory allocated by strdup, so free it when you're done

There you go. Pretty simple to use in your Objective-C app.

The Podcasts I Roll With

My semi-daily commute to downtown Denver is usually occupied by sports talk radio, although I do love to listen to a few different podcasts each week. I thought I would share the ones I like and get some feedback on some other podcasts I haven’t heard of as well.

I am mainly interested in iOS and mobile related podcasts. I also love Iowa State basketball so I do listen to the Cyclone Fanatic podcast.

a16z

a16z may be my favorite podcast. It’s not development related, but the interviews they hold with business and technology leaders around the Bay Area are always interesting.

NSBrief

NSBrief focuses specifically on iOS development. The interviews are mostly with independent iOS developers with each interview covering a different framework or aspect of iOS development.

Adventures in Angular

AngularJS is the JavaScript framework of choice when I do web development. There are usually great discussions and interviews in all of their episodes.

App-To-Date

Weekly episodes about the hottest new apps and what’s happening with the popular apps such as Snapchat, Facebook and others.

DEBUG

In depth interviews with former and current Apple employees. I’ve gained a lot of insight about how things within Apple work from this podcast.

iDeveloper Podcast

They recently put out a podcast on developing an app with Swift and the challenges that came along with that.

Keeping up with these podcasts isn’t that difficult, as some publish less frequently than others. a16z has probably been the most consistent at 1 or 2 a week, but most of the others are maybe 1 or 2 a month. If you’re a developer or technology enthusiast, I’m sure you’ll find a lot of value in these podcasts.