Category: iOS Development

Drawing on a UIView through gestures

My latest project on GitHub shows how to draw on a canvas using different touch gestures. The project can be found here:

All that needs to be done on the view in order to interact with gestures is to declare conformance to the UIGestureRecognizerDelegate protocol in your header file.

#import <UIKit/UIKit.h>

@interface CustomView : UIView <UIGestureRecognizerDelegate>


Now in your implementation you have access to these events:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    // Capture the starting point here.
    [self setNeedsDisplay];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    // Capture each point as the finger moves.
    [self setNeedsDisplay];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [self setNeedsDisplay];
}

Notice that in the began and moved handlers I’m capturing the point on the screen. I’ve left out the implementation of what to do with those points, which you can get from the project on GitHub.
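If you’re curious what a minimal version of that implementation might look like, here is a hypothetical sketch (the lastPoint property and the dot drawing are my own illustration, not the project’s actual code — the GitHub project keeps a full array of points):

```objc
// Hypothetical sketch -- not the actual GitHub project code.
@interface CustomView ()
@property (nonatomic, assign) CGPoint lastPoint; // assumed property for illustration
@end

@implementation CustomView

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    self.lastPoint = [touch locationInView:self];
    [self setNeedsDisplay]; // ask UIKit to call drawRect:
}

- (void)drawRect:(CGRect)rect {
    // Draw a small dot at the most recent touch location.
    [[UIColor blueColor] setFill];
    CGRect dot = CGRectMake(self.lastPoint.x - 4, self.lastPoint.y - 4, 8, 8);
    [[UIBezierPath bezierPathWithOvalInRect:dot] fill];
}

@end
```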

A key to custom drawing is to call [self setNeedsDisplay] whenever you want to redraw the UI. This triggers the drawRect: method, where all of your drawing code needs to go. The entire view is redrawn when this is called, so anything you want to stay on screen needs to be drawn there.

- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSaveGState(context);

    const CGFloat *components = CGColorGetComponents([UIColor blueColor].CGColor);
    [[UIColor colorWithRed:components[0] green:components[1] blue:components[2] alpha:0.7] setFill];
    [[UIColor blueColor] setStroke];

    CGRect rectangle = CGRectMake(200, 300, 300, 300);
    UIBezierPath *rectPath = [UIBezierPath bezierPathWithRect:rectangle];
    [rectPath setLineWidth:2.0f];
    [rectPath stroke];
    [rectPath fill];

    CGContextRestoreGState(context);
}

This code draws a big blue rectangle, 300×300, with a lighter (semi-transparent) blue fill and a blue outline. Its origin is at x = 200, y = 300; (0, 0) is the upper-left corner of the UIView canvas.

A lot of the implementation and logic can be found in the GitHub project. This was just a high level overview of how to draw on a UIView and capture gestures. If you have any questions on the project leave a comment and I’ll get back to you.

Capturing Text from Dictation in iOS

Users of your app can tap the little microphone key on the left side of the keyboard to dictate what they would normally type. A handy feature for users, but difficult to track in your application: while the UITextView or UITextField text property gets populated, the standard delegate methods such as shouldChangeCharactersInRange: are not called.

How to capture the text from a UITextField:

Add this to your awakeFromNib or viewDidLoad method

[self.textField addTarget:self
                   action:@selector(eventEditingChanged:)
         forControlEvents:UIControlEventEditingChanged];

The implementation of the eventEditingChanged method

- (void)eventEditingChanged:(UITextField *)sender {
    if (sender.text.length <= 0) {
        UITextInputMode *inputMode = sender.textInputMode;
        NSString *modeIdentifier = [inputMode respondsToSelector:@selector(identifier)] ? (NSString *)[inputMode performSelector:@selector(identifier)] : nil;

        if ([modeIdentifier isEqualToString:@"dictation"]) {
            // do something with sender.text
        }
    }
}

How to capture the text from a UITextView:

Add this to your awakeFromNib or viewDidLoad method

[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(modeChange:) name:UITextInputCurrentInputModeDidChangeNotification object:nil];

The implementation of the modeChange method

- (void)modeChange:(NSNotification *)notification {
    NSString *inputMethod = self.textView.textInputMode.primaryLanguage;

    if (inputMethod != nil) {
        if (![inputMethod isEqualToString:@"dictation"]) {
            // do something with self.textView.text
        }
    }
}

And finally, remove the observer:

- (void)dealloc {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

Now your users can be as lazy as they like and skip typing altogether. Enjoy capturing that dictated text.

Implementing Base 36 in your iOS App

Base 36 can be a great way to represent a number using up to 8 alphanumeric characters. If you have an auto-number and don’t want to display it as a plain number, you can convert it to Base 36 to make it look a little sexier. Wikipedia has a great overview of Base 36.

To convert a number to Base 36 you can use this C-based algorithm:

static char *base36enc(long unsigned int value)
{
	char base36[37] = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ";
	/* log(2**64) / log(36) = 12.38 => max 13 char + '\0' */
	char buffer[14];
	unsigned int offset = sizeof(buffer);

	buffer[--offset] = '\0';
	do {
		buffer[--offset] = base36[value % 36];
	} while (value /= 36);

	return strdup(&buffer[offset]); /* caller must free() */
}

Call this from Objective-C to get the string value:

char *wo = base36enc(someint);
NSString *stringValue = [NSString stringWithUTF8String:wo];
free(wo); // base36enc allocates the result with strdup

There you go. Pretty simple to use in your Objective-C app.

iOS 8 API. A look at Home Kit

Home Kit is another exciting new framework being released with iOS 8 in the fall of 2014. If you’d like to read the overview from Apple, they’ve posted an article explaining the high-level features. Home Kit is similar to Health Kit in that the data is stored in a central database for other apps to interact with. I’ll be getting into some of the specifics of the API and an overview of the new Home Kit Accessory Simulator.

The main idea of Home Kit is to communicate with physical devices through an iOS device (iPod, iPhone, iPad). Siri is integrated into Home Kit so you can control physical devices through voice commands. It will be interesting to see all of the devices that will be integrated with Home Kit in the coming years. This will be a huge selling point for Apple going forward as more and more manufacturers come on board.

Here’s how the object hierarchy works: Home (HMHome) -> Rooms (HMRoom) -> Accessories (HMAccessory) -> Services (HMService). Rooms can also be grouped into zones (HMZone).

  • Home:
    A home is meant to be a single physical location. A user may have multiple homes set up on their device.
  • Rooms:
    A room is meant to be used as a grouping of devices (accessories). For example, if you wanted all of the lights in your family room turned on or off at once, you could tell Siri “Turn on the lights in the family room”.
  • Accessories:
    Accessories are part of a home, but can also be assigned to a room in that home. An accessory is a physical device that is specially built to communicate with the Home Kit API. Your oven may be built to interact with the Home Kit API, so you could say “Preheat the oven to 350” (Which would be wicked cool).
  • Services:
    Services are different functions of an accessory. So an oven would have multiple features like heating, setting a timer, and turning the oven light on/off.
  • Zones:
    Rooms can be grouped into zones, so you could group all of your rooms in the basement into the basement zone. At that point you can command a function for all accessories in a collection of rooms such as “Turn off the lights in the basement”.
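As a hedged sketch of that hierarchy (assuming the iOS 8 HMHomeManager API; the home and room names are my own placeholders), building part of it might look like this:

```objc
#import <HomeKit/HomeKit.h>

// Hedged sketch: create a home, then add a room to it.
HMHomeManager *manager = [[HMHomeManager alloc] init];
[manager addHomeWithName:@"My House" completionHandler:^(HMHome *home, NSError *error) {
    if (home) {
        [home addRoomWithName:@"Family Room" completionHandler:^(HMRoom *room, NSError *error) {
            // Accessories discovered by HMAccessoryBrowser can now be assigned to this room.
        }];
    }
}];
```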

Let’s take a look at the Home Kit Accessory Simulator. In Beta 1 of Xcode 6 you can add accessories for the HMAccessoryBrowser to find. I assume updates in the future will also allow developers to add services to interact with as well.

let browser = HMAccessoryBrowser()

Once your code identifies accessories in the home, you can then choose to add them to a room.

When Beta 2 is released I’ll expand on this article and show some concrete code examples.

After reviewing the Home Kit API, this could potentially be a really big deal for Apple. This will help take home automation to the next level making it mainstream in the next 3-5 years.

iOS 8 API. A look at Health Kit

Health Kit is introduced with iOS 8 as part of the new HealthKit.framework. I created a new project to play around with the new functionality available to developers. The most interesting class to developers will be the HKHealthStore class which will allow the sharing of the user’s health data to other apps. The user chooses which data will be accessible to other applications.

The first thing to check is the class method isHealthDataAvailable, which reports whether Health Kit is available on the device.

let dataAvailable = HKHealthStore.isHealthDataAvailable()

Health Kit allows the developer to write data to the Health Kit data store. Any data written by the app can be read and updated whenever it’s needed.

Before an app can access data, it must request authorization. Three characteristic types it can request read access to are BiologicalSex, BloodType, and DateOfBirth:
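A minimal sketch of such a request, assuming the iOS 8 HealthKit identifiers (HKCharacteristicTypeIdentifierBiologicalSex, etc.) and the requestAuthorizationToShareTypes:readTypes:completion: method:

```objc
#import <HealthKit/HealthKit.h>

// Hedged sketch: request read access to three characteristic types.
HKHealthStore *store = [[HKHealthStore alloc] init];
NSSet *readTypes = [NSSet setWithObjects:
    [HKCharacteristicType characteristicTypeForIdentifier:HKCharacteristicTypeIdentifierBiologicalSex],
    [HKCharacteristicType characteristicTypeForIdentifier:HKCharacteristicTypeIdentifierBloodType],
    [HKCharacteristicType characteristicTypeForIdentifier:HKCharacteristicTypeIdentifierDateOfBirth],
    nil];

[store requestAuthorizationToShareTypes:nil
                              readTypes:readTypes
                             completion:^(BOOL success, NSError *error) {
    // Check success before reading the characteristics.
}];
```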


This is a simple introduction to the API; it has much more functionality for developers, including the ability to write queries against the data.

Health Kit is sure to be a hit with iOS users, as health-related apps are already very popular in the App Store. Being able to centralize a user’s data will add a lot of cohesiveness among apps, which will bring a better user experience to iOS users going forward.

Apple advances tools and security to help keep their dominant position in the enterprise and education markets.

With Blackberry bleeding cash and recently laying off over 4,000 employees, they no longer have the privilege of owning the enterprise market. Blackberry’s share is so small in the enterprise now, they don’t even show up on most new-activations charts. Blackberry’s CEO, John Chen, has given the company a 50/50 chance of survival. Enterprise needs a stable and dependable mobile technology partner. While Blackberry originally owned this market, Apple has made huge inroads in the last few years. In Q4 of 2013 mobile device activations favored Apple, with a 73% market share for iPhones and over 91% for tablets with the iPad and iPad Mini. Tim Cook, CEO of Apple, has publicly stated how important the enterprise market is and how the company will continue to invest heavily in this space.

In February of 2014, Apple announced welcome changes to enterprise deployment, management, and security. Apple also made things easier for students by allowing those under the age of 13 to sign up for their own Apple IDs. The under-13 IDs require parental consent to sign up, and Apple also limits which services are available.

Apple has updated their Enterprise Deployment Overview document, which explains how devices can be deployed, along with integrating personal devices as a hybrid approach. Being able to support BYOD (bring your own device) is very important for enterprise deployment, seeing as how many employees are already carrying a mobile device with them. In a BYOD environment users can enroll in their corporate mobile device management (MDM) program by initiating the installation of a configuration profile on their device. If the owner of the device leaves the company, they can simply remove the profile. The devices can then be provisioned for apps and remotely managed just like a company-purchased MDM device.

With employees reluctant to carry a Windows Phone, and Android still having the perception of not being a secure platform, Apple should continue to dominate the enterprise market. They are definitely making it easy on corporations by allowing volume discounts, purchase orders, and devices configured before they are shipped. Apple’s focus on security only strengthens the argument for corporations to deploy and manage iOS devices for their employees.

This post was originally written for the Aspenware blog

Keystone Laboratories iOS App Released

Keystone Laboratories, Inc. has just released its first mobile app in the iTunes Store. The app is free and is currently available for download on iPhones running iOS 7.0 and higher. Customers will be able to view their sample results through the Keystone Laboratories app once they have a username and password set up with their project manager. Josh King, the developer of the application, states that “This is just the beginning for getting your sample data to your mobile device. I’d love to hear feedback on the current version and what you’d like to see in the future. I have a first draft of an iPad application as well.”

Jeff King, Laboratory Director, exclaims, “This app will go a long way toward adding value to the customer experience. Josh has done a great job developing the app in a short amount of time.”

To check out the app for yourself, view it in the iTunes store. Keystone Laboratories – Josh King

Debugging Complex iOS View Issues

At certain times throughout a software developer’s career, some things just don’t make sense. All of a sudden something stops working the way it used to, and you have no idea why. This was the case when a button in my UITableViewCell was no longer responding to any touch events. I checked the IBOutlet and IBAction from my UIButton 4 or 5 times to see if they were set up correctly. I even re-did the bindings just to make sure they were OK. My breakpoint was not being hit when I tapped the button. I was perplexed and stumped! As George Costanza would say, “These pretzels… are making me thirsty!”

I remembered a tool demoed at the Denver iOS Meetup back in January to debug complex iOS view issues. The tool is called Spark Inspector. Spark Inspector allows you to see the class hierarchy of any view you are currently in when running the simulator. It also allows you to play with the properties of the class to experiment having different options turned on/off.

The following is a screenshot of the view I was having trouble with. As you can see, Spark Inspector has an option to view your view hierarchy in 3D, which is awesome.


As you can see, there is an unwanted view sitting in front of the button I want to be able to tap. The button is the star in the upper-right and bottom section of the view. The view sitting in front of everything is a class called UITableViewCellContentView. From what I’ve read, this class is new in iOS 7 to help with certain interactions, such as found here.

After messing around with the properties of the class in Spark Inspector, I found that setting the Alpha to 0 allowed me to interact with my button again! This was a great discovery, but actually solving the problem in my code was not trivial. Within the class that loads my UITableViewCell from a xib, I dug into the subviews until I found the culprit and then set its Alpha to 0.

    NSArray *subviews = [self subviews];
    NSLog(@"Subviews count: %lu", (unsigned long)subviews.count);

    for (UIView *view in subviews) {
        NSLog(@"CLASS: %@", [view class]);
        NSArray *otherViews = [view subviews];
        NSLog(@"Sub-Subviews count: %lu", (unsigned long)otherViews.count);

        if (otherViews.count > 1) {
            if ([otherViews[1] respondsToSelector:@selector(setAlpha:)]) {
                [otherViews[1] setAlpha:0];
            }
        }
    }

This did the trick. There may be a more elegant way to solve this problem, but I thought I’d share this solution in case others come across a problem like this one. I’d love to hear any other suggestions on debugging iOS view issues if you’ve had a similar experience.