unFocus Projects – Kevin Newman and Ken Newman
http://www.unfocus.com
Home of Scripts 'n Styles for WordPress, Backstage2D and History Keeper!

Quint Dice update – Facebook and profile
http://www.unfocus.com/2016/01/13/quint-dice-update-facebook-and-profile/
Wed, 13 Jan 2016 18:18:04 +0000

An update on Quint Dice. I turned on all the Facebook registration and login functionality, which has actually been done for a while but which I had kept turned off for a few reasons. Users may now sign up with their Facebook account, to make it easier to get started. Adding Facebook support has been one of the bigger time sinks in this project. Here's a list of what's done and what isn't, so you'll see why.

Done:

  • Facebook accounts – done!
  • Facebook link accounts (if you have a password based account, you can add Facebook to it, then log in with your FB account in the future).

Not Done:

  • Facebook Canvas – as I mentioned in an earlier post, it looks like ads are a problem within Canvas. I still have to figure out what to do about that.
  • Facebook notifications – The setting is available, so you may opt-out (it’s on by default) but I haven’t set it up to send notifications yet. It looks like I need to have a return URL within a Canvas app to make it work, but I’m not certain, and need to read the FB docs more carefully.
  • Native Facebook in Cordova for Android and iOS. This is turning into an endless headache. There are at least 3 Meteor packages that promise to do this, but two of them need a Cordova plugin that I just can't seem to build, and a third I'm not sure how to install. I had hoped this would be much easier (and I'm actually pretty sure I once had it working), but it's costing me a lot of time, so I'm probably going to let it lie for a while.
  • A user who registers with Facebook will have no username – I need to make sure they enter one, since that’s the primary way we do user discovery ATM.
  • No friends list, which in my mind is an important feature for starting games and inviting folks to play.
  • Account merging – if a user signs up one way (password) then signs up again another way (Facebook) then wants to link their accounts, they should be able to. There is a Meteor package (or 2) for this, but I have to implement some glue for it.

Other updates in this round are better styling for forms, and the ability to set other profile data, such as your email address (useful if you forget your password!). You can also change your username if you want, and opt-out of push notifications on mobile.

More soon!

Introducing Quint Dice!
http://www.unfocus.com/2016/01/09/introducing-quint-dice/
Sat, 09 Jan 2016 19:54:53 +0000

I've been working on a game in my spare time for the last few months. It's a social dice game called Quint Dice. What makes it different from so many dice games out there is that it's based on dice that have color pairs, and you can play it with more than two players.

What I'd like to avoid with Quint Dice is pay-to-win forms of revenue. Currently there are no bonus rolls. I'll eventually add some, but you won't be able to buy them or stockpile them to gain an advantage over your opponents. I think I'll add one bonus roll per game, and if you don't use it in that game it goes away. I may also add a second bonus type – an extra die. The idea is to add a level of strategy and flexibility to the play, without allowing a fundamental shift in advantage for one player or another just because they paid for it.

The only revenue source built into the game at launch is a small banner ad at the bottom. I'd also like to add custom dice packs, and maybe some full-on themes. I'm hoping that will be enough to turn this into something that pays for itself. I may also play with interstitial ads, but only as a voluntary way to earn points to buy custom dice packs and themes without shelling out cash, for users who prefer that route. I like this better than pestering players with involuntary interstitial ads as a way to get them to pay. Annoying players is not my favorite model, no matter how common it is in mobile gaming. Finally, there will eventually be an option to remove the ads.

I built Quint Dice with Meteor and React, and I would like to eventually port to React Native, but I’m using Cordova for the time being on Android and iOS (soon!). Like so many of the projects I play with under the unFocus banner, this has mostly been a learning exercise. But I’m happy with the results, and thought I should probably dust off this blog, and may start to share some thoughts I have as I develop these things.

To kick that off, I'll share a couple of things I learned while getting this out the door, in no particular order. If you'd like to know more about any of these items, please leave a comment, and I'll see about writing a follow-up post.

  • Facebook integration is easy/hard. Getting notifications to work seems pretty easy, but getting a canvas app to work poses some challenges, particularly where advertising is involved. You can't use AdSense inside an iframe, which is needed for FB canvas. Instead you'll need to go with one of Facebook's approved ad vendors. They all have that institutional feel to them, if their websites can even be reached. Not a fantastic dev experience. The solution I'll probably go with is to create a Facebook canvas based landing page, and then flow my users to my domain from there, instead of having them play within the canvas page.
  • Meteor's accounts system is awesome! With very little effort you can get up and running with a full accounts system, and there are a ton of Meteor packages to expand its functionality. I ended up building custom UI in the end, but to get started I used the off-the-shelf accounts-ui, so I didn't have to wait. I'll probably be using a link accounts package to add the ability to associate Facebook, Google+ and maybe other third-party account services (Amazon perhaps) with existing Quint Dice accounts. I may also use an account merge package so users can merge their accounts if they accidentally sign up with two different auth sources and want to combine them into one. There are two different packages for that – and these are the kinds of things that make Meteor so fantastic! I can't think of another platform where something like that is so easy to set up. Setting this up has some interesting challenges in terms of user flow, and it's probably worthy of a blog post or two (there's a quick sketch of the basic login calls after this list).
  • My onboarding process is a mess in the current iteration. I hope to fix that with the above-mentioned link and merge packages. I may also play around with having an anonymous user for anyone who comes to the site and is not logged in. That way they can just get started.
  • Finding players is another messy area so far. I basically only collect one bit of information from users – a username. To start a game with other players, you are presented with a giant list of every player. This clearly needs work. Eventually I’d like to add Facebook friend support, and maybe even a friends list internal to Quint Dice. I’ll also add more profile data and some way to search on that (this is on my short list).
  • Push notifications are relatively easy to set up on Android, and relatively more complicated on iOS, but I should have that out soon (this is the only thing holding up an iOS release). I did figure out how to get a nice black and white notification icon to work, and that maybe warrants its own blog post (see this Meteor forums post for now). I'm using the raix:push package in Meteor for that – there's a minimal configuration sketch after this list.
  • Meteor's React support is built around a React mixin, which basically wraps a single method on a component to make it reactive (see the sketch after this list). This makes sense given that Meteor typically doesn't enforce any kind of application architecture on the developer (a good thing IMHO), but I will probably switch to using something more Flux-like. For non-reactive data sources and application state, I'm already using a Flux-like pattern/architecture (using SignalsLite.js), but I may look into something like Reflux (or maybe Redux, or Alt) and then figure out how to move my reactive Meteor handling to that level. This probably warrants a blog post or two.
  • I used Adobe Animate CC to create the animated dice roller (output as HTML5 Canvas of course). CreateJS is pretty sweet, even on mobile. I may experiment with OpenFL for new dice packs, and see how well that runs. I'm thinking that custom dice packs will stay in HTML5, even if I eventually transition to React Native, so that they can be truly cross-platform. The only challenge with that might be an eventual port to Apple Watch and Apple TV, which don't support WebViews. I'm curious though if there is a way to use the JS behind my canvas based mini-apps, and render it through some HTML5 canvas wrapper from within a JavaScriptCore instance (is JSCore even available on watchOS and tvOS?). When I figure this out, I'll almost certainly blog about it. Of course, I may not even need all that if I go with OpenFL, because they have a native C++ compiler.
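
Since the accounts notes above are fairly abstract, here's a minimal sketch of the login wiring I'm describing. It assumes the stock accounts-password and accounts-facebook packages, and the permission list and names are examples, not necessarily what Quint Dice actually uses:

[js]
// Client-side sketch using Meteor's built-in accounts packages.
// Log in (or sign up) with Facebook:
Meteor.loginWithFacebook({
    requestPermissions: ['public_profile', 'email'] // example permissions
}, function (err) {
    if (err) {
        console.log('Facebook login failed:', err.reason || err);
    }
});

// A plain old password-based account, via accounts-password:
Accounts.createUser({
    username: 'somePlayer',
    password: 'aGoodPassword'
}, function (err) {
    if (err) {
        console.log('Sign up failed:', err.reason || err);
    }
});
[/js]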
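
And for the push notification setup mentioned above, this is roughly the shape of a raix:push configuration – a sketch based on that package's documented API, with placeholder IDs rather than my real project values:

[js]
// Client: register this device for push. The senderID is a placeholder GCM project number.
Push.Configure({
    android: {
        senderID: 123456789,
        alert: true,
        badge: true,
        sound: true,
        vibrate: true
    },
    ios: {
        alert: true,
        badge: true,
        sound: true
    }
});

// Server: notify a specific user, for example when it's their turn to roll.
// opponentUserId is a placeholder for the other player's user id.
Push.send({
    from: 'QuintDice',
    title: 'Your turn!',
    text: 'It is your roll in Quint Dice.',
    query: { userId: opponentUserId } // matches push tokens registered for that user
});
[/js]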
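
Finally, to show what I mean about the React mixin, here's a simplified sketch of the ReactMeteorData pattern – the Games collection and the rendering are placeholders, not the actual Quint Dice code:

[js]
// The mixin wraps getMeteorData() so it reruns reactively when the
// underlying Meteor data (subscriptions, collections, current user) changes.
GamesList = React.createClass({
    mixins: [ReactMeteorData],

    getMeteorData: function () {
        return {
            currentUser: Meteor.user(),
            // "Games" is a placeholder collection of in-progress games
            games: Games.find({ players: Meteor.userId() }).fetch()
        };
    },

    render: function () {
        var items = this.data.games.map(function (game) {
            return React.createElement('li', { key: game._id }, game.name);
        });
        return React.createElement('ul', null, items);
    }
});
[/js]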

Going forward, I’ll try to post more, probably when I make an update. There are a ton of other important packages (aldeed:simple-schema and aldeed:collection2) and technologies to cover, and I’m sure I’ll mention them eventually.

HiDPI/Retina for CreateJS (Flash Pro HTML5 Canvas)
http://www.unfocus.com/2014/03/03/hidpiretina-for-createjs-flash-pro-html5-canvas/
Tue, 04 Mar 2014 00:25:12 +0000

Adding HiDPI and Retina screen support to a CreateJS project (a Flash Pro HTML5 Canvas publish, rendered with EaselJS) is easy enough. Just add this code after the point where the stage is defined in your published html file (either inside the generated init function, or in handleComplete if there are external assets to load):

[js]
if (window.devicePixelRatio) {
    // grab the width and height from canvas
    var height = canvas.getAttribute('height');
    var width = canvas.getAttribute('width');
    // reset the canvas width and height with window.devicePixelRatio applied
    canvas.setAttribute('width', Math.round(width * window.devicePixelRatio));
    canvas.setAttribute('height', Math.round(height * window.devicePixelRatio));
    // force the canvas back to the original size using css
    canvas.style.width = width + "px";
    canvas.style.height = height + "px";
    // set CreateJS to render scaled
    stage.scaleX = stage.scaleY = window.devicePixelRatio;
}
[/js]

IE 10 doesn't support devicePixelRatio, but you can get a reasonable devicePixelRatio with this (include it before your CreateJS script, and call window.getDevicePixelRatio() instead of using the standard property):

[js]
/*! GetDevicePixelRatio | Author: Tyson Matanich, 2012 | License: MIT */
(function (window) {
    window.getDevicePixelRatio = function () {
        var ratio = 1;
        // To account for zoom, change to use deviceXDPI instead of systemXDPI
        if (window.screen.systemXDPI !== undefined && window.screen.logicalXDPI !== undefined && window.screen.systemXDPI > window.screen.logicalXDPI) {
            // Only allow for values > 1
            ratio = window.screen.systemXDPI / window.screen.logicalXDPI;
        }
        else if (window.devicePixelRatio !== undefined) {
            ratio = window.devicePixelRatio;
        }
        return ratio;
    };
})(this);
[/js]
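
Putting the two snippets together is a small variation on the first one – read the ratio from the polyfill when it's available, and fall back to the standard property otherwise:

[js]
// Combined version of the two snippets above.
var ratio = window.getDevicePixelRatio ? window.getDevicePixelRatio() : (window.devicePixelRatio || 1);
if (ratio > 1) {
    var height = canvas.getAttribute('height');
    var width = canvas.getAttribute('width');
    canvas.setAttribute('width', Math.round(width * ratio));
    canvas.setAttribute('height', Math.round(height * ratio));
    canvas.style.width = width + "px";
    canvas.style.height = height + "px";
    stage.scaleX = stage.scaleY = ratio;
}
[/js]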

If you like to load your CreateJS based animation in an iframe (I do) and the canvas content is the only content, you may also want to add some styles to avoid scrollbars and extra padding:

[css]
body {
    margin: 0;
    padding: 0;
}
canvas {
    display: block;
}
[/css]

Before making these edits, I recommend copying the main html page, so that you don’t have to worry about the publisher wiping out your changes when you publish again (if certain things change, you may need to reintegrate your changes).

Some Notes:

  • Some (slightly) older versions of Safari on OSX with Retina screens seemed to automatically apply a pixel doubling to canvas elements, so this might be redoubling again (I’m really not sure). It doesn’t seem to do this in the latest version though.
  • If you use "cacheAsBitmap" the content will be cached at the normal resolution. If you can find the place in the code where that's being set, you can apply the devicePixelRatio multiplier there by passing it in as the scale argument of the cache call (see the sketch below), but CreateJS does not do this by default.
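
To illustrate that last note, here's roughly what passing the multiplier into a cache call looks like – a sketch that assumes you can reach the cached object in the generated code (someCachedShape and the cache bounds are placeholders):

[js]
// EaselJS's cache() takes an optional scale argument, so the bitmap cache
// can be generated at the device resolution instead of the default 1x.
var ratio = window.devicePixelRatio || 1;
// someCachedShape and the bounds (0, 0, 100, 100) are placeholders.
exportRoot.someCachedShape.cache(0, 0, 100, 100, ratio);
[/js]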

Relevant info for devicePixelRatio and supporting HiDPI / Retina displays

SVGView for Xamarin.Android
http://www.unfocus.com/2014/02/19/svgview-for-xamarin-android/
Wed, 19 Feb 2014 23:39:58 +0000

Instead of making dozens of PNG files for all the various screen sizes of icon assets, I wanted to use vector graphics (such as SVG) in an Android app I'm building with Xamarin.Android. There is a tool for generating the images, and that's better than nothing, but SVG is even easier, and I'm all about easier. I thought this would be relatively easy to do, but it turns out Android has no built-in support for vector image formats.

Xamarin has a nice binding project and sample for using an SVG library (I think it wraps SVG-Android on GitHub) in Android apps, but it wasn’t clear how to use that, and there was an annoying gotcha I hit along the way, that I thought I’d document here.

There are two projects in the sample solution. One is the library project, and the other is a sample project, with sample art that you can build to see it working. What we want to do is build the library, and then copy the necessary components into our own Android app project. Here’s how to do that using Xamarin Studio.

  1. Download and unzip the project files from Xamarin (or fork it on GitHub). Open the Solution in Xamarin Studio. You should see something like this:
  2. Put the build mode into “Release” and then right click (or control click) on SvgAndroid (Highlighted in the screenshot above), and then click Build SvgAndroid (or highlight the project and press cmd+K). This will make a release .dll file (a .NET assembly) in the bin folder:
  3. You'll need to copy two files into your own project: SvgAndroid.dll from bin/Release, and svg-android-1.1.jar from the Jars folder. I put the SvgAndroid.dll file in the project/Assemblies folder in my project hierarchy so that the dll could be managed in git with the rest of my project (the git rule of thumb – include what you need to build, and I need this dll to build the app). The jar file – svg-android-1.1.jar – went into the project/Jars folder.
  4. Add the assembly to the project: Right click on references in the Solution panel, and choose “Edit References.” In there, add the .dll under the “.Net Assemblies” tab.
  5. Add the jar file: Add the jar file to your project using add file or add folder (to add the entire Jars folder). Then right click the jar file, and choose “Build Action” -> “AndroidJavaLibrary” to make sure it gets packaged with your application.

That’s it! Those two files are all you need. Now you can create an SVGView class, and use that in your axml layouts. Here’s a quick and dirty example of the class:

[csharp]using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Android.App;
using Android.Content;
using Android.OS;
using Android.Runtime;
using Android.Util;
using Android.Views;
using Android.Widget;
using Com.Larvalabs.Svgandroid;

namespace unFocus
{
    public class SVGView : ImageView
    {
        protected string svgSrc;
        public string SVGSrc {
            get {
                return svgSrc;
            }
            set {
                svgSrc = value;
                setupSVG ();
            }
        }

        public SVGView (Context context) :
            base (context)
        {
            Initialize ();
        }

        public SVGView (Context context, IAttributeSet attrs) :
            base (context, attrs)
        {
            Initialize (attrs);
        }

        public SVGView (Context context, IAttributeSet attrs, int defStyle) :
            base (context, attrs, defStyle)
        {
            Initialize (attrs);
        }

        void Initialize ()
        {
        }

        void Initialize (IAttributeSet attrs)
        {
            var a = Context.ObtainStyledAttributes(attrs, Resource.Styleable.SVGView);
            SVGSrc = a.GetString(Resource.Styleable.SVGView_svgSrc);
            a.Recycle ();
        }

        void setupSVG ()
        {
            // svg-android doesn't work in hardware mode, so set software
            SetLayerType (LayerType.Software, null);

            if (null == SVGSrc)
                return;
            SVG svg = SVGParser.GetSVGFromAsset (Context.Assets, SVGSrc);
            SetImageDrawable (svg.CreatePictureDrawable ());
            Invalidate ();
        }
    }
}[/csharp]

And in values/attrs.xml:

[xml]<?xml version="1.0" encoding="UTF-8" ?>
<resources>
    <declare-styleable name="SVGView">
        <attr name="svgSrc" format="string"></attr>
    </declare-styleable>
</resources>[/xml]

Now you can create SVGViews in axml using:

[xml]<unFocus.SVGView
   android:svgSrc="svg/somesvgfile.svg"
   android:layout_width="58.0dp"
   android:layout_height="58.0dp"
   android:id="@+id/icon" />[/xml]

SVG files go in the Assets folder, in whatever tree you want. In this example, they are in Assets/svg/. Build action for svg files is “AndroidAsset,” which should be the default.

There is an irritating gotcha, that’ll have you tearing your hair out if you don’t know about it.

SVG files MUST have width and height attributes on the root element to work with this library. If you don't have them (and Adobe Illustrator CC doesn't add them by default), the lib will fail with cryptic error messages. The fix is easy enough: just open the SVG in Xamarin Studio and add width and height attributes. There will already be a viewBox attribute with the correct values (viewBox="0 0 70 70" – the second two numbers are the width and height). You'll need to add these: width="70px" height="70px".

Update: One other thing I forgot to mention – this didn't work on the Xamarin Alpha channel for some reason. The SVGAndroid binding was failing if I remember correctly (at least that's where the runtime errors seemed to originate). So if you are having trouble getting this to work, it might be something in the Alpha channel.

Update 2: What I've shown you here will work, but some folks on the Xamarin forums suggest there may be advantages to including third party code in alternative ways (like including an entire project, etc.). Have a read.

Update 3: SVG-Android won't work under hardware acceleration, which is enabled by default in apps targeting Android 3.0 and newer. You just get a blank space. The easiest way around this is to set the Application, the Activity, or the specific view you are working on to use software acceleration.

I modified the SVGView example above to do this automatically, but you can also do it yourself (using the code above) or by setting android:layerType="software" on the specific view (or sometimes its parent).

This is worth knowing about because other types of drawables (such as animations) show their own incompatibilities with hardware acceleration (such as fuzzy, low-resolution renderings), and setting software mode can fix those too.

Enjoy!

Compiling Mono (and PlayScript) on OSX Mountain Lion (10.8)
http://www.unfocus.com/2013/10/10/compiling-mono-and-playscript-on-osx-mountain-lion-10-8/
Thu, 10 Oct 2013 15:26:34 +0000

Quick update/note: Zynga has pulled the source for PlayScript, and no one has taken it up. Safe to assume it's a dead project.

A while back, Zynga employees demonstrated a project they are working on called PlayScript, an implementation of AS3 plus an ASNext wish-list language (also named PlayScript) on the Mono platform. In order to play with it, it's probably best to compile your own copy, since any binaries they post will quickly get out of date. Wanting to play around with it on my iPhone, I took a stab at compiling the Mono project on OSX, for use in Xamarin Studio. I used a copy of PlayScript-mono for this post, but these instructions should really apply to any fork of Mono (I think).

The OSX compile instruction page on Mono-Project.com is a bit hard to follow if you are new to this stuff, so I thought I’d write up some more detailed instructions in the hope it would save someone some time.

First, the prerequisites. You'll need Xcode, and the command line tools. Grab Xcode out of the Mac App Store, and run it. Then go to Preferences under the Xcode menu, then the Downloads tab/button/icon (organizationally it's a tab, but it looks like a button with an icon). In there, you should install "Command Line Tools". This contains some of the stuff you'll need to build Mono on OSX.

The build instructions on mono-project.com say you’ll need a version of mono installed before you can compile. I had Xamarin and Unity3D installed before I tried to build, and one of them seemed to cover my bases.

There are a couple of prereqs still missing. From various sources, it looks like the make and autoconf tools used to be included with Xcode’s command line package, but they aren’t anymore in OSX 10.8 (Mtn Lion). There are a couple of ways to install them, including building them yourself, but the easiest way I found is to use Brew. Installing Brew is easy enough – copy the ruby command from the brew website, and run it in a terminal.

I always recommend going to the primary source for the correct install method on these kinds of things, so go there, install, and then come back.

Next, use brew to install automake, autoconf, and libtool. You don’t need “sudo” – but don’t worry, brew will complain if you forget. You’ll get a message about libtool being prefixed with a g to avoid conflicts – this didn’t seem to have any unwanted effect for building Mono.

brew install autoconf libtool automake

Note: I had trouble with brew install and libtool on a new mac while writing this post. I didn’t have this problem when I built it on an older iMac. I’m not sure what caused it, but if you get any errors during the brew step (mine was about linking libtool) you can type “brew doctor” in the build directory, and it’ll give you some pointers. My specific problem was that /usr/local/lib wasn’t owned by my user account. The brew suggestion was to “chown” that dir, and rerun the link command for libtool (the step that failed during install), so this was the command I used to fix my brew libtool problem:

sudo chown $USER /usr/local/lib
brew link libtool

There were some other notes about rearranging things in your path for git, etc., but I didn’t bother with any of that.

Now we are ready to compile Mono. The first thing you need to do is download a copy of the Mono source from somewhere. A mainline mono archive or SVN checkout would work, or you could clone a local copy of PlayScript-mono from GitHub, then go inside the folder, which is what I did:

git clone https://github.com/playscript/playscript-mono.git
cd playscript-mono

Now we are up to the configure stage, which is where you will start to run into trouble if you don’t have the proper prerequisites setup. The instructions on mono-project say you can use ./configure from a tar, but I wasn’t able to get that to work from either git or tarball. ./autogen.sh seemed to work from both sources though. Note: the prefix flag is where Mono will be installed when you run “make install” and unless you know what you are doing, you probably don’t want to leave that as the default value. Here’s the bolded warning from mono-project, “It is strongly advised not to install Mono from source in /usr, /usr/local or any other “standard” directories unless you know what you are doing.” I put mine in a directory matching the repo name I’m building from my user directory.

./autogen.sh --prefix=/users/{kevin}/mono-playscript --enable-nls=no
make
make install

This can take a LONG time!

It's been a while.
http://www.unfocus.com/2013/05/15/its-been-a-while/
Wed, 15 May 2013 19:29:39 +0000

It's been a while since I posted. What's up?

I've been working on a ton of stuff, and learning what feels like 10 new platforms. There's a lot of excitement out there in both the mobile development and HTML5 spaces. I've been deep diving into WordPress, learning CoffeeScript and TypeScript (I even started a Backbone.js implementation in TypeScript and called it CruddyMVC, but I didn't get far, just the model, and I need to back-port some ideas I implemented in a PHP/WordPress version of the same), node.js (I even got my first bit of node.js code out in the wild just yesterday), and recently Xamarin, which looks like a great replacement for Flash and AIR for mobile apps (even without PlayScript and Zynga's AS3 language bindings, though those are pretty nice to have). I think I'll even throw together a video sharing app using Xamarin and a node.js server, cause that sounds like a great way to spend my nights and weekends. I might even dig back into unBrix (here's the even less feature-complete Android version). I have some ideas on how to keep the late game from getting boring, and it really could use some more features and, you know, levels.

I've got some fun things to post about WordPress dev (I've been doing a lot of WordPress and Backbone.js at work), and about some tools I made like OnceForm (make sure to check out the redux branch), which adcSTUDIO contributed to open source under the GPL (I wrote a giant intro post for that a couple of weeks ago, but then Firefox crashed…).

Anywho, just wanted to post .. something. So here it is. I’ll try to post more often about some of this stuff, there’s a lot to talk about!

Multiple LocalHost Sites
http://www.unfocus.com/2012/07/24/multiple-localhost-sites/
Tue, 24 Jul 2012 14:50:12 +0000

I've got XAMPP installed on a Windows 7 machine. I wanted a way to test multiple sites locally.

Set up the local host file

In your host file, add (replace with the domains you want):

[code]127.0.0.1    sub.domain.com.dev
127.0.0.1    www.example.com.dev
127.0.0.1    www.unfocus.com.dev[/code]

UPDATE: Thanks to xip.io the host file edits are optional if you use the style www.example.com.127.0.0.1.xip.io (replace 127.0.0.1 with your actual IP address). This is compatible with Adobe’s Edge Inspect.

Set up XAMPP config files

In \xampp\apache\conf\httpd.conf uncomment the following at about line 140:

[code]LoadModule vhost_alias_module modules/mod_vhost_alias.so[/code]

I’ve used the folder structure of “\xampp\htdocs\www.example.com\web\content\files.php” (for example) and the following config will find the folder with the exact name in the url.

In \xampp\apache\conf\extra\httpd-vhosts.conf:

[hoops name="config"]

Fix DOCUMENT_ROOT

If you are having issues with DOCUMENT_ROOT, create setdocroot.php at “\xampp\” and put the following in it:

[hoops name="fix"]

References:
https://issues.apache.org/bugzilla/show_bug.cgi?id=26052#c27
http://stackoverflow.com/questions/138162/wildcards-in-a-hosts-file
http://postpostmodern.com/instructional/a-smarter-mamp/

SignalsLite.js, Unit Testing, and Buggy IE
http://www.unfocus.com/2012/06/12/signalslite-js-unit-testing-and-buggy-ie/
Tue, 12 Jun 2012 23:12:54 +0000

I decided to finally learn unit testing, so I downloaded QUnit (after looking at the 20,000 different unit testing options), and figured I'd give porting tiny SignalsLite to JavaScript a try, and see how the process goes.

While doing that, I found a crazy IE7/IE8 JS bug that I'm sure has had me scratching my head in the past. Here is a quick unit test to show the problem:

[sourcecode language="javascript"]
test( "Basic Requirements", function testReqs() {
    expect(1);
    var T;
    (function makeT() {
        T = function T(){};
        T.prototype.test = 1;
    })();
    ok((new T).test, "Instance of exported T should have prototype methods");
});
[/sourcecode]

If you run that in IE7 or IE8, it'll fail!

The cool thing is, without having created unit tests for SignalsLite.js, I would never have known that could be an issue, and instead would continue to scratch my head when stuff like that broke in IE7/8. I found this because I was trying to export SignalLite from within a closure (I try to always define my stuff inside of closures to avoid namespace pollution), with this:

[sourcecode language="javascript"]
(function() { "use strict"; // standard header

    // naming inline functions makes the debug console easier to read.
    window.SignalLite = function SignalLite() {
        // stuff
    }
    SignalLite.prototype = {
        // methods
    };

    // The fix is to use an anonymous function, or export elsewhere:
    // window.SignalLite = SignalLite;

})();
[/sourcecode]

For whatever reason, that doesn’t work in IE7 and IE8. Unit testing is crazy!
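
For reference, here's the kind of export pattern that does work everywhere – essentially the fix noted in the comment above:

[sourcecode language="javascript"]
(function() { "use strict"; // standard header

    // Define the named function first...
    function SignalLite() {
        // stuff
    }
    SignalLite.prototype = {
        // methods
    };

    // ...then export it. IE7/8 treat a named function expression as a separate
    // object from the in-scope name, so exporting after the definition (or using
    // an anonymous function expression) keeps the prototype attached.
    window.SignalLite = SignalLite;

})();
[/sourcecode]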

If you are interested, go fork SignalsLite.js on GitHub.

P.S. You can run the SignalsLite.js unit tests here to see the fail for yourself! I disabled that test in the SignalsLite.js tests.

Backstage2D – the GPU Augmented Flash Display List
http://www.unfocus.com/2012/05/04/backstage2d-the-gpu-augmented-flash-display-list/
Fri, 04 May 2012 21:42:14 +0000

I've been playing with some 2D API ideas built on top of Flash's Stage3D and Actionscript 3.0. I call it Backstage2D, the GPU augmented flash display list.

Currently, Backstage2D’s code base is mostly a playground for proof of concept of some API ideas. Some stuff in this post may not match the git repo (for example, I’m still using “layer” instead of “surface”). There’s a bunch left to do, but it is working enough to run a modified version of MoleHill_BunnMark that some folks from Adobe put together (I actually lifted most of my GPU code from that example code, heh). The BunnyMark example was adapted from Iain Lobb’s BunnyMark, with some additions from Phillipe Elsass. You can view the Backstage2D version of BunnyMark here (and check out the original BunnyMark MoleHill here).

Fork Backstage2D at GitHub.

The rest of this post describes the thought process that went into Backstage2D.

The Flash AS3 display list API is not the best way to utilize the massively parallel capabilities of a GPU, or to deal with the other limitations of a CPU/GPU architecture. The display list's deeply nestable DisplayObject metaphor, and all the fun filters and blend modes, just don't translate well to a flat, very parallel GPU hardware renderer. All of this is especially true on mobile devices – iPhones, iPads and Android devices – and that's the primary target for Backstage2D.

With an API like the traditional flash display list, it's easy to create situations that can't easily be batched, due to branching operations and other things which change the GPU state, break parallel processing, and slow everything down. You see this in Adobe AIR's GPU render mode, where seemingly random things can have a huge negative impact on performance. Behind the scenes AIR attempts to break the content into batches to speed things up. The use of certain features, or of normal features in certain ways, can drop you out of a batch. When performance degradation happens, it's not always clear why. Because of that, to get great performance you must target just a subset of the normal features, and apply a lot of discipline to make sure everything keeps working smoothly.

I wanted to do something different. I wanted to play with an API that is intentionally unlike the Flash display list – one designed to help the implementor (Flash developer or designer) understand how to arrange their content so that it renders very quickly, even on mobile devices – and still get the benefits of all the glorious Flash stuff we are all used to.

Here are some of the primary principles I came up with, which impact the API:

  • In order to take advantage of the parallel nature of GPUs, we need to group many Quads (think DisplayObject) into batches. The API should make batches easy to understand and use, so there's no guessing about what's going on.
  • GPUs like shallow content – they draw a lot of triangles all at the same time. There is no nesting on the GPU, so while some form of organization is necessary, the infinite nesting model must be reined in.
  • Backstage2D shouldn't do too much in an automagic kind of way. Guessing about the impact of nesting things a certain way, using a blend mode, or using certain features translates into extra effort and cost during production, because of the unpredictable negative impact those features can have on performance. Features should work as you expect them to, and the performance impacts of doing certain things should be clear.
  • Think of the GPU as a remote server that you send instructions to. Uploading things like Textures to the GPU from system memory is slow (especially on mobile). Backstage2D should make these stress points clear.
  • Flash’s vector engine is tops, and working with Bitmaps (and sprite sheets) sucks! The API should enable the continued use of the display list, in GPU friendly ways. Drawing vector art on the GPU is hard, and is ugly anyway. So leverage the CPU rasterizer, and make sure the API makes the GPU upload bandwidth and render time overhead clear.
  • Backstage2D objects shouldn’t look like traditional display list objects – we’ll use names other than Sprite, MovieClip, DisplayObject, etc.

Of these, batching is the starting point, since it is the most necessary for advanced performance, and affects how data must be organized the most. You can draw each Quad (think Flash DisplayObject, or Sprite) individually by setting up the vertex, program, texture, etc. data for each quad, and calling drawTriangles for each and every Sprite. But the GPU can't optimize to run in parallel if you do that – most of its processing cores end up underutilized in that model.

Batching lets more than one quad be drawn simultaneously, but there are limitations – every item in a batch must use the same vector data (a giant array of x, y and other data), a single texture, and other state information, like blend modes. Additionally, the entire batch must be drawn without interruption, which means you can't insert items from other batches (with other state settings, like a different blend mode) in the middle of the batch.

So batches resemble layers, or surfaces. The model for Backstage2D will be a series of stacked surfaces, instead of a deeply nested tree structure starting at the root.

In this paradigm, the surface gets batched, and the children it contains get rendered in parallel – perfect for GPUs. To eliminate batch-breaking APIs, certain "state changing" operations – like blend mode settings, or adding and removing elements from a surface – can be applied only to an entire surface, not to each element. The limitations of the surface API should help the implementor understand the impact of doing certain things. If you need to have 100 elements, and every other element has a blend mode of Multiply while the one below it has a blend mode of Normal – in the traditional Flash API this is fine, and can actually run pretty well. On the GPU, all 100 elements must be rendered individually in 100 distinct surfaces. Having that many surfaces feels heavy because it is heavy.

Texture changes are one of the things that break batching – a Shader can deal with only one texture (well, actually up to 8 – but that makes the pixel shader more expensive to run), so a set of elements in a batch must be combined into a sprite sheet or texture atlas. If you’ve tried to use a texture atlas in another 2D rendering engine, you may have noticed these are a pain to deal with – and usually it requires setting them up manually before compilation. This is one thing that Backstage2D handles for you – at runtime – in an automagic kind of way.

This feature was actually done for a bunch of reasons. One thing I'd like to add is a resolution (screen DPI) independent measurement mode, where assets get generated on each device an app runs on, from high quality vector art, for exactly the DPI the system is running at, and scaled to real-life sizes. Type specified at 12 points should truly measure at 12 points.

Additionally, Flash vector art looks great (especially with the new 16X16 quality modes), but it looks its best when rendered to match the screen exactly. Resizing prerendered vector art can ruin its beautiful anti-aliasing. Proper sizing can also help performance on older hardware like an iPhone 3GS, which is actually pretty capable, but doesn't cope well with iPhone 4 retina screen sized material (4x more pixels than will be displayed).

Setting all this up is expensive – especially generating the sprite sheet. But just setting up vector data and loading even predigested textures is already expensive enough that you wouldn't want to do those tasks while your app is running a smooth animation – it will cause missed frames, and your users will notice. So Backstage2D's API should guide the user to avoid doing expensive things while an app or animation is running. It exposes build, load, and/or upload commands per layer. That way, the implementer always knows that what they are doing is computationally expensive (down the road, the plan would be to move much of that into concurrency – more on that another time).

The characteristics of this are very different from normal Flash, where the aim is to load only the minimum of what's needed, when it's needed, and to try to keep as much as you can off the display list. In the Backstage2D model (in the standard surface type anyway), an entire surface, and all its children (called "Actors" to avoid colliding with AS3's "Sprite", etc.), gets rendered up front to a big TextureAtlas, and stored in memory or on disk. How to optimize and organize your assets to avoid running out of memory becomes an entirely different matter from how you optimize for the CPU. A surface will have an associated sprite sheet bitmapData asset though, which can be measured.

With these restrictions in mind, the idea would be to create a variety of surface types to suit differing kinds of content. For classic content: a standard static Quads surface (done!), still-frame animations (sprite sheet animations – generated at runtime), tweened animations (inverse kinematics – the bone tool), and streaming animations (dynamic MovieClips, large MovieClips, or video) – maybe even some surfaces useful for standard UI, like scroll panes. For more advanced 2D assets, a variety of different mesh layer types could be added (that's where GPU Stage3D programming gets fun!).

I’d love to flesh this out with more features, including an animation subsystem that would include a couple of different Animation display types. Alas, free time is short, and I’ll probably never get to it. But I already spent a lot of time on this (I broke my foot, and was couch bound for a while) so thought I’d share where I got to. 🙂

Scripts n Styles Update 3.1
http://www.unfocus.com/2012/05/03/scripts-n-styles-update-3-1/
Thu, 03 May 2012 19:11:16 +0000

Scripts n Styles received a major update today. The two big features added are LESS.js support and Dynamic Shortcodes! The "Global" Settings page now has a LESS editor with syntax highlighting (via CodeMirror) and on-the-fly compiling, so you can see how it'll be output on the theme side. The per-page meta-box has gained a new tab in which you can create one-off shortcodes that can contain arbitrary HTML content.

Scripts n Styles is a free OpenSource GPL project that you can fork and contribute to on GitHub! (You can also fork and contribute to CodeMirror and LESS.js.)

As a Shortcode example: I placed the following html into the Shortcodes tab and gave it the name “tweet test”.

[code lang="html"]
<a href="https://twitter.com/share" data-via="WraithKenny" data-size="large" data-related="unFocusProjects" data-hashtags="ScriptsnStyles">Tweet</a>
<script>!function(d,s,id){var js,fjs=d.getElementsByTagName(s)[0];if(!d.getElementById(id)){js=d.createElement(s);js.id=id;js.src="//platform.twitter.com/widgets.js";fjs.parentNode.insertBefore(js,fjs);}}(document,"script","twitter-wjs");</script>
[/code]

I then use the shortcode [[sns_shortcode name="tweet test"]] to display: [sns_shortcode name="tweet test"]

Fast AS3 Signals with SignalsLite
http://www.unfocus.com/2011/12/05/fast-as3-signals-with-signalslite/
Tue, 06 Dec 2011 01:18:15 +0000

I was playing around and ended up writing a lite Signals class (ok, 3 classes). The set works like a basic AS3 Signal, minus most of the extra functionality of AS3 Signals (run-time argument type checking on dispatch, as one example). The goal was to create a very fast Signal dispatcher, with very little overhead, that dispatches with absolutely no heap allocation (check, check and check) – targeted mostly at mobile (AIR). Regular AS3 Signals does well, but it seemed to have a lot of extra stuff that I don't need – and this was a fun kind of exercise anyway.

Some quick numbers from the performance-test with 1,000,000 iterations on a Core 2 Duo 2.6GHz (in milliseconds):

Func call time: 15
Runnable call time: 5
Event (1 listener) time: 863
Signal (1 listener) time: 260
SignalLite (1 listener) time: 232
RunnableSignal (1 listener) time: 56

Func call (10 listeners) time: 190
Runnable call (10 listeners) time: 399
Event (10 listeners) time: 2757
Signal (10 listeners) time: 741
SignalLite (10 listeners) time: 725
RunnableSignal (10 listeners) time: 221

The SignalLite line is vanilla SignalLite, and the Signal line above it is Robert Penner's AS3 Signals. They are pretty close, but SignalsLite takes a modest edge. But let's look at the same test on iOS (iPhone 4S) with 100,000 iterations:

Func call time: 171
Runnable call time: 26
Event (1 listener) time: 3723
Signal (1 listener) time: 789
SignalLite (1 listener) time: 481
RunnableSignal (1 listener) time: 117

Func call (10 listeners) time: 2004
Runnable call (10 listeners) time: 1892
Event (10 listeners) time: 9217
Signal (10 listeners) time: 4030
SignalLite (10 listeners) time: 2074
RunnableSignal (10 listeners) time: 498

On iPhone you can see that SignalLite is almost twice as fast as AS3 Signals – a more substantial difference than on desktop. I'm not sure why that is – maybe the AOT compiler can optimize something about SignalLite better – IDK, but it sure is fast!

Then there's that last line in each group – RunnableSignal. Now you're talking speed. That one also solves a particular problem that function callback systems all seem to have – there is no compile-time function signature checking. You have to wait until the thing runs to find out you are taking the wrong number of arguments, or the wrong type, etc. But solving one problem (compile-time type checking) solves the other (speed), and that brings us to SignalTyped, which RunnableSignal in the test above extends (I'll probably rename it at some point).

SignalTyped is the beginnings of a fast-executing, type-safe implementation of AS3 Signals. The idea is, you extend 2 classes – SignalTyped and SlotLite. SignalTyped is effectively an abstract class – you must extend it and implement the dispatch method and the constructor (at least for now; I'm looking for better ways to handle this). It takes a bit of boilerplate to implement this in a class that would expose signals. This example is based on the performance test from Jackson Dunstan's CallbackTest, which I borrowed (I hope that's ok!):

[sourcecode language="actionscript3"]
// Interface for your class that might have listeners for the SignalTyped.
// Make one of these per listener type.
interface IRunnable {
    function run(): void;
}

// Custom Slot has a specific property for the Runnable class.
class RunnableSlot extends SlotLite
{
    public function RunnableSlot( runnable:IRunnable = null ) {
        if ( runnable )
            this.runnable = runnable;
    }
    public var runnable:IRunnable = new EmptyRunnable;
}

// An empty IRunnable class for the first node.
class EmptyRunnable implements IRunnable {
    public function run():void {};
}

// You need one of these per dispatch type.
class RunnableSignal extends SignalTyped
{
    // last and first must be set to the typed Slot.
    public function RunnableSignal() {
        last = first = new RunnableSlot;
    }

    // implement the dispatch method to call the runnable prop directly
    // It's easy to have it take and dispatch any type you want – with compile time type checking!
    public function dispatchRunnable():void
    {
        var node:RunnableSlot = first as RunnableSlot;
        while ( node = (node.next as RunnableSlot) ) {
            node.runnable.run(); // FAST!
        }
    }
}
[/sourcecode]

That’s all necessary for the implementation requirements – a lot of boilerplate, I admit. Then you expose that in a class that might use it all:

[sourcecode language="actionscript3"]
class MyDisplayObject
{
    // could probably make this a getter..
    public var signaled:RunnableSignal = new RunnableSignal;
}
[/sourcecode]

Now for the consumer to use this, it’s just a bit more boilerplate than a normal signal:

[sourcecode language="actionscript3"]
class MyConsumerOfSignalLite implements IRunnable // boilerplate point 1
{
    public function MyConsumerOfSignalLite()
    {
        var dspObj:MyDisplayObject = new MyDisplayObject();
        // add the signal (boilerplate point 2 – normal)
        dspObj.signaled.addSlot( this );
    }

    // boilerplate 3 – normal, but more strict – naming is specific – FAST!
    public function run(): void {
        // do whatever when signaled
    }
}
// boilerplates 2 and 3 are normal for any signal, except the strictness of #3
[/sourcecode]

What’s cool about this is you get compile time type checking for your method signature, and the performance improvement that comes with skipping those checks at runtime.

I'm also thinking about a slightly different signal API that would be more like Robot Legs' contract system – think signals by contract – I'm working on it. Since we would be implementing a defined interface per signal type, we could boil the add methods and signal nodes down to one method that adds all the listeners of a single object – one add method per dispatching class, instead of one per signal on the dispatching class. This could lead to a reduction in boilerplate. We'd filter by interface type instead of using multiple signal.add nodes and methods. So – improved runtime performance, a reduction in (usage) boilerplate (if not implementation), and compile-time type checking. I love it!

Note – I tested none of the examples in this post, and the code on GitHub is all very early stage stuff. The performance-test class works though – give it a try!

Oh, here’s the github repo:
https://github.com/CaptainN/SignalsLite

Adobe's Flash/AIR Messaging Nightmare
http://www.unfocus.com/2011/11/10/adobes-flashair-messaging-nightmare/
Thu, 10 Nov 2011 22:45:13 +0000

Update: Mike Chambers posted an explanation and clarification on where Adobe is headed with Flash and AIR. Update 2: TechCrunch picks up (part of) the narrative.

Yesterday I published an old post with my thoughts on the "Flash is Dead" thing that pops up routinely in media circles after anything happens to shake things up (like an Apple ban on Flash, or Adobe dropping a supported platform, etc.). In that piece I optimistically highlighted the promise that AIR technology represents – it's even in the title, "Flash and AIR, Nothing But Opportunity". I really believe the technology could fulfill all the promise those of us down in the weeds perceive in it. I also believe that Adobe's Flash Platform engineers and evangelists see that promise, and would like to see it fulfilled.

Yesterday Adobe unceremoniously dropped support for an entire class of platforms. No more Flash Player in mobile browsers. It’s not a terrible technical decision – working in AIR and native app land offers a ton more flexibility. It even makes business sense. Browser makers are increasingly hostile to Flash – Apple has never let it in the door on iOS (and never will), and Microsoft announced plans to kill off plugins even on the desktop in Windows 8 Metro interface. Browsers have become hostile territory for Flash, so it makes sense to move emphasis in the two directions the industry is headed – app store apps with AIR (which no one knows about) and HTML5 for browsers. In an important way, this does mean Flash is dead – it’s not going to be in the browser going forward. It really is out of Adobe’s control.

But there's a problem. The longer Adobe bumbles the messaging, the harder it is to say for sure whether there is a lack of commitment to their platform (including AIR), or if it is truly just a PR problem. This kind of announcement had an easy-to-predict effect on Flash's brand, yet there was no attempt to get out in front of that narrative and show they are committed to the larger "Flash Platform", of which AIR is an important part going forward. In the non-technical parts of the industry – the media, managers, and the creative side of production teams – they all heard: Adobe Flash is out of mobile, use HTML5. It's even worse in client land, where the term "HTML5 app" is used regularly along with "app store" – this news was so harmful that clients with existing Flash content, which can be ported to the app space easily with AIR, are really freaking out. I can tell them about AIR all I want, but it's hard for me to counteract all the media buzz (repetition is reality – brain science).

But what if they got the right message? This kind of move could represent a real intent on the part of Adobe's leadership to get out of the Flash Platform altogether, and maybe out of the platform space entirely, and to focus instead only on tooling to produce for the platform commons that HTML5 represents. Look at the kinds of decisions they've made recently. Adobe has essentially dropped internal support for their "Flash Platform" on every system platform they can, by either straight up dumping it (Linux, mobile Flash, TV), or by farming out porting and support to partners like RIM.

On the other hand, Adobe's Flash evangelists and engineers seem committed to the "Flash Platform", which in an un-articulated narrative (narrative – it's how we think – more brain science) really means AIR in app stores (mobile and desktop), but I'm not sure I'm getting the same message from the real decision makers at Adobe. I don't know if it's intent, or just plain old bad PR judgement, but it feels like I'm standing on the greasy platform, and it's getting pretty tough to keep my balance. Some folks are already sliding off.

I think they are in it for the long haul, and they've even built some of their own apps on the little-known Flash-based mobile app technology that is AIR. But guessing at someone's intent is problematic – and that only makes the PR problem clearer. I shouldn't have to guess.

It boils down to this. I know technology, and I know the Flash Platform. I know it has merit and potential. But if people can't tell whether the decision makers at Adobe are serious about supporting it into the future, it's going to be a tough haul to convince anyone to build anything on that platform. I already know a few platforms, including HTML, and learning a new one isn't scary, but I really prefer Flash and AIR because of its potential and even its legacy, which has value (despite the tar Steve Jobs dumped on it). If Adobe can't or won't make it clear that they are committed to AIR and the Flash Platform, I'll have to find an alternative – and the decision won't be mine. At this point, we need a clear, unambiguous statement of intent from Adobe – are you committed to the Flash Platform and AIR, or not? A public roadmap wouldn't hurt either.

Flash and AIR, Nothing But Opportunity
http://www.unfocus.com/2011/11/09/flash-and-air-nothing-but-opportunity/
Thu, 10 Nov 2011 03:39:27 +0000

Preface: I wrote this one of the last few times the "Flash is dead" thing made the media rounds, because it seems as though many participants in the discussion are simply missing the bigger picture: that the market for rich interactive work is splitting between app store apps (native applications) and desktop browser-based apps (websites), and that those divisions are deep enough to require different development mindsets. The post is overly long – I don't have an editor – but I figured I'd post it in its current draft state, since this keeps coming up, and so I don't have to noodle with it anymore. 😛 So here it is. (Instant update: Lee Brimelow has said similar things in fewer words on his blog. Update 2: Thibault Imbert chimes in. Update 3: Mike Chambers rolls the narrative. Now back to making awesome!)

In the technology business, if you aren't looking ahead, you are being left behind. There is a fundamental shift occurring in the content technology space, where Flash and HTML live their happy lives. This shift has mostly been explained using old terms, like "apps" and "HTML5 vs. Flash" – and those explanations miss the point. They all describe how things were yesterday and are today, but miss how they will be tomorrow. The browser has been, and is today, the primary means of application and content delivery. A new set of opportunities for delivering content is changing all that. The Split puts the traditional desktop browser market on one side, and app stores on new platforms, with new hardware and new interface paradigms, on the other.

App stores should be more broadly called content stores, because the line between apps and other kinds of content is pretty thin. Market-specific content stores have been around for a while already on the desktop. Game shops like Steam and Direct2Drive already make up the lion's share of the PC games market, and iTunes was already a form of app store, before apps were apps.

The companies behind every platform are adopting app stores – all the major operating systems on traditional PCs, including OSX and soon Windows. Open source trail blazers like Ubuntu have actually had something like app stores for a long while now. Additionally, more and more types of content are being pulled into them: apps, music and movies, magazines and local newspapers. The models for monetization are so much clearer, the tools to take advantage of the various models are already built, and for the consumer it's all very convenient. App stores are the new reality.

To really understand why this is happening, and what it means for those of us who make a living in the weeds, we need to understand where we are, and how we got here.

The PC Era

In the early days of personal computing, “applications” (or “programs”) were the hot action. You needed something to do with your new beige personal computer (PC), so you bought (or borrowed) software or other types of content on diskettes, and later CD-ROM (oh the magic), and installed that software to run on your PC or Mac. It was an offline process, but it was the only realistic way to go. Even if you had access to the internet, you weren’t going to download megabytes of data over your cutting-edge 14.4Kbps fax/modem connection. Traditional forms of acquisition ruled in those days. You had to take yourself to the store and buy a box or a publication or whatever else to obtain content – probably paying with cash.

When the internet hit mainstream in the 90s, and data speeds increased, the transition from “applications” delivered on boxed diskettes to continuously updated “websites” began. The internet had some advantages over boxed content. The biggest was that accessing a website through the internet was exceedingly convenient for consumers – far more convenient than traveling to the store and buying a box with a CD of clip art on it. For content producers there was also a sense of limitless shelf space compared with traditional retail outlets, so they were quick to try to carve out an advantage there. Search engines and content indexing services like Google and Yahoo! made a killing on both ends by providing a way for content producers to get their content in front of users.

Broadband completed the transition. At the dawn of the new millennium, “the internet” became the primary means of content and application delivery (aside from a few important smaller markets like games and productivity apps). The browser was the vehicle for all of it, and for good reason: the content is easy to access from multiple platforms, and is super convenient. All you need is an internet connection and a browser.

A Flash of brilliance

At around the same time, Microsoft mostly won “the browser wars” with Internet Explorer 6 and basically stopped forward movement in their browser for many years, and the commerce of the browser era’s “website”-based economy was able to mature. The stagnant development of the dominant browser platform created a challenging environment, one in which it’s easy to see why Flash was able to thrive.

Flash brought many improvements over the browser, through constant performance and scriptability advancements, as well as significant additional features that the browsers in the aggregate simply couldn’t match – video being an important, notable one. Additionally, Flash provided consistency across browsers and operating systems, and comparatively great performance when measured against HTML and JavaScript. An HTML-based browser app simply couldn’t (and still can’t) match it. Flash in the browser became the go-to platform for serious interactive work on the internet. You just couldn’t get similar levels of awesome out of IE6 and the rest of the browsers of the time.

All good things

The split started to happen in 2006. On the PC, which really means in the PC browser, Adobe was getting more serious about the application space in the browser, releasing the first version of Flash with AVM2 (and ActionScript 3.0) – a much more stable foundation than ActionScript 2.0 had been – along with an update to its application framework, Flex, that took advantage of the improvements in ActionScript 3.0. This helped move trends in Flash’s direction, as seemingly every great site was built using the plugin technology. IE7 had come out that same year, but it only added to developers’ pain in the short term, and it still wasn’t the robust, interoperable platform that the browser ecosystem needed to compete in the applications space. So in that space, movement continued toward Flash.

This could be considered the golden age for Flash. Flash ruled the content space during that time, in everything from banner ads, to browser-based games, to anything dealing with charts and data (so-called RIAs), to just about all the video delivery on the internet.

Browsers didn’t come without problems. They have been slow to innovate, incompatible with one another – universally slow, buggy and crashy – and often full of horrible security holes (especially IE, the dominant player). They were mired in standards battles, forks, and company and social politics (open source/EU fines) – but mostly, the leader, Microsoft with IE6, just held everything up. On top of all that, it was difficult for content producers, like traditional newspapers, to find revenue sources other than ad systems. The market was set for change.

That’s about when Apple fired the first warning shots across the bows of the PC browser fleet, by releasing the first iPhone, which could browse the internet, but didn’t run Flash. A brand new platform – software and hardware – with a brand new interface paradigm: touch, instead of mouse and keyboard. This would be a platform built from the lessons of the browser era, and it provided a wide open space for Apple to do what it does best. They rapidly iterated on their ecosystem, and came up with the overwhelmingly successful App Store, a system that seemingly everyone wanted into. This was a system that came with multiple obvious revenue streams built right in – app sales, technology cross-licensing, advertising, etc. – all things that could be done in the browser space, but that the app store made exceedingly convenient for both producers and consumers. Apple catered to that demand masterfully, and over time expanded opportunities to include in-app purchases, magazine publishing platforms, and subscription services, among others.

In the same way the internet – the modern PC era – had provided enough advantages over the previous content delivery systems to overshadow any of its shortcomings, the App Store model would provide enough promise to overshadow its possible shortcomings measured against its predecessor. App stores proved so compelling, and so big a threat to the existing browser-based models, that they almost immediately ended a cozy relationship between Apple and Google, who had ruled the browser era as the gatekeeper to content and the owner of essentially all advertising on the web. Google moved quickly to duplicate the app store system for Android, and the other platform makers – WebOS, and Microsoft’s Windows Phone 7 Series – have been playing catch-up ever since. Eventually, Apple brought the app store system to the desktop in OSX Lion, and even Microsoft is picking it up in their Windows 8 Metro interface for full app store coverage in the traditional PC markets.

The rapidly evolving iPhone (later iOS) platform created new ways to think about a lot of things. The most important were app and content delivery, and new revenue sources through new monetization strategies. The Apple App Store changed everything.

The end of an era

When Apple released the iPad in April 2010, Steve Jobs announced the “end of the PC era.” With the release of the iPad, Apple did nothing less than complete and publish the rule book rewrite they began with the iPhone. More than anyone else, the folks at Apple seemed to understand that there is a divide between the “PC era” – which is really the “PC browser era” – and the new app store era. They understood that the two are on different trajectories, and that the app store era will supersede the browser. From now on, for better or for worse, applications would exist in app stores, and websites would just be websites.

In the same month Apple released the iPad, Steve Jobs followed up with his open letter titled “Thoughts on Flash”, which highlighted some of the negatives of the browser-based “PC era,” where Flash was settling in as the dominant platform. The letter also exploited a division between the Flash crowd and the standards and open source crowds. And he directly addressed the “full web” – Adobe’s tone-deaf name for “the PC era”. In that direct critique Jobs countered the disadvantages of the new app store model by putting “full web” Flash apps in the “free” – or unprofitable – box, and painting the technology with the old brush. Even the main part of the label, “PC,” is an old term, from a time that came before the modern browser era.

That letter was truly a brilliant piece of market positioning magic, but it was ultimately unnecessary, and Apple has since backed off. The app store model provides a marvelous promise without the need to denigrate the old browser-based economy. Content makers, all of whom struggled to find revenue from websites, now have multiple new revenue streams to explore, through app sales, licensing, and other kinds of content transactions within apps.

During the PC era, while browsers dominated users’ mind share and time on the PC, native applications were still the clear leaders in performance, access to hardware, and close integration with the underlying OS platform. Despite that advantage, native apps were hamstrung by a seemingly insurmountable inconvenience – the boxed distribution model – an inconvenience that most online distribution stores of the time simply duplicated (download, unzip setup, run setup, store the setup file somewhere in case you lose your hard drive, etc.).

App stores solve these native application distribution problems by providing a central hub for content and simple e-commerce (no more typing your credit card into a random, unverified website), and they can be integrated with the legacy system – the website.

My head hurts.

So what does this all mean for us, the front line Flash developers? It means opportunity. There are now three platforms to develop for!

Yeah, that’s right – three.

The transition to app stores on the desktop will take a while to roll out, old habits die hard, and Flash will stick around in that space for… well, as long as that space exists. There is still a chunk of 98%+ of users out there on the internet, still accessing the web through their existing PCs. That won’t change overnight. Even when initiatives like Microsoft’s plugin blockade in Windows 8’s Metro mode take effect, they will come hand in hand with app stores, so there’s a workaround.

But let’s get real for a second: the Flash Player – in the browser – sits at the core of entire new lucrative markets on the PC. Take browser-era social gaming and Zynga – a game company with a quirky, social, micro-transaction game library, integrated with Facebook’s social platform, that is more profitable than top traditional PC game companies like EA. Flash in the browser is having a grand time. Stage3D was just released, and Unreal Engine was shown running on it at MAX. Flash is still tops for the best kinds of awesome on the internet.

Second, you have all the HTML5 opportunity – not directly relevant for Flash devs (yet), but for those of us who have had our hands in both worlds this whole time, this is exciting! HTML, JavaScript and CSS are finally getting to the point where you can build really awesome stuff with them. And, for app store monetization to work, discovery is key. Searchable HTML (and HTML5) will dominate for that. App stores are easy to search and easy to link into – from a website. Websites aren’t going anywhere – in every way, the app store model can’t work without the browser-based internet.

And finally, the new kid on the block, the app store. For Flash devs, that means AIR – which is essentially Flash for app stores. If you have Flash (or even HTML) skills to burn, you can almost just recompile your Flash app for AIR. Adobe has built this amazing tool – the best kept secret they didn’t mean to keep (don’t get me started on their PR). The sky is the start with AIR for Mobile, never mind the limit (Apollo indeed). The best part is, once you build for one app store with AIR, you can build for basically all of them with very little additional effort.

Have a look at Machinarium. A traditionally packaged standalone desktop app, made with Flash, and distributed in a box through traditional outlets (and the specialty PC app stores, like Steam) with an online demo that runs in the desktop browser in Flash Player. Now republished for the Apple App Store with AIR and some optimizations, to run on iPad as a native app.

So where are we? Flash is alive and kicking – thriving even – despite the clueless ramblings of know-nothing media pundits and their bandwagon seeking behavior. You don’t need to listen to them, just get out there, and make cool apps/websites/games/whatever else with the same technology you’ve always used. These are exciting times.

 

]]>
http://www.unfocus.com/2011/11/09/flash-and-air-nothing-but-opportunity/feed/ 1
Performance Benchmarks with AIR 2.7 for iOS http://www.unfocus.com/2011/07/03/performance-benchmarks-with-air-2-7-for-ios/ http://www.unfocus.com/2011/07/03/performance-benchmarks-with-air-2-7-for-ios/#comments Sun, 03 Jul 2011 06:14:19 +0000 http://www.unfocus.com/?p=632 Continue reading "Performance Benchmarks with AIR 2.7 for iOS"]]> I’ve been working on this benchmark, based on Iain Lobb’s BunnyMark. Being a bit confused sometimes about which things speed rendering up and which slow it down, I didn’t want to guess anymore, so I grabbed Iain’s code base (because I’m lazy and didn’t want to start from scratch) and added some tests for things I suspect are slowing things down (or speeding them up). I think this will also help shed some light on why some folks see a huge gain in AIR 2.7 CPU mode, while others do not.

Some caveats – this only tests instances of flash.display.Bitmap on the display list, at the size they are, moving the way they move. It’s on my list to add blitting (I have some initial work on that done, thanks to Iain, but I need to add the rotation and alpha settings to it), and I’d like to add a vector test, and maybe some extra-sized Bitmaps (I’ve heard that makes a difference).
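
For context, here’s a rough sketch of the kind of display-list test this measures – not Iain’s actual BunnyMark code, just an illustration (the class name, bunny size and speeds are made up): a fixed pool of flash.display.Bitmap instances is added to the stage once, and each frame only their properties are updated.

[cc lang="actionscript3"]
package {
	import flash.display.Bitmap;
	import flash.display.BitmapData;
	import flash.display.Sprite;
	import flash.events.Event;

	// Assumes this is the document class, so stage is available in the constructor.
	public class MiniBunnyTest extends Sprite {
		private const COUNT:int = 500;
		private var bunnies:Vector.<Bitmap> = new Vector.<Bitmap>();
		private var speeds:Vector.<Number> = new Vector.<Number>();

		public function MiniBunnyTest() {
			// One shared BitmapData stands in for the bunny PNG.
			var art:BitmapData = new BitmapData(26, 37, true, 0xFFFFFFFF);
			for (var i:int = 0; i < COUNT; i++) {
				var bunny:Bitmap = new Bitmap(art);
				bunny.x = Math.random() * stage.stageWidth;
				bunny.y = Math.random() * stage.stageHeight;
				addChild(bunny); // added once up front, never removed during the test
				bunnies.push(bunny);
				speeds.push(1 + Math.random() * 4);
			}
			addEventListener(Event.ENTER_FRAME, tick);
		}

		private function tick(event:Event):void {
			for (var i:int = 0; i < COUNT; i++) {
				var bunny:Bitmap = bunnies[i];
				bunny.y += speeds[i];
				// The "Alpha" setting: alpha follows y and changes every frame.
				bunny.alpha = bunny.y / stage.stageHeight;
				if (bunny.y > stage.stageHeight) {
					bunny.y = 0;
					// The "Rotation" setting: only rotated at the edge of the stage.
					bunny.rotation += 90;
					// The CaB / CaBM switches would toggle bunny.cacheAsBitmap
					// (and cacheAsBitmapMatrix) here.
				}
			}
		}
	}
}
[/cc]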

Enough! Here are some results – quality had no effect on GPU mode, so I included only one line:

Note: some are reporting they see a difference in GPU mode, but I still don’t. Update: It appears some users are confusing “Mobile Performance Tester” with BunnyMark, which explains the discrepancy. BunnyMark is not currently in any App Store, which is one key distinguishing feature. 😉

BunnyMark Results – 500 Bunnies
Settings (column toggles): Alpha, Rotation, CaB (cacheAsBitmap), CaBM (cacheAsBitmapMatrix)
FPS-L/M/H/B = framerate at Low/Medium/High/Best stage quality

iPhone 3GS – GPU
  FPS    24  18  17  22  13  13  19  1    19
iPhone 3GS – CPU
  FPS-L  28  21  19   9  19   5   7   5    5
  FPS-M  28  21  19   4  18   3   7   5    3
  FPS-H  28  21  19   3  18   2   7   5    2
  FPS-B  28  21  19   3  18   2   7   5    2
iPhone 4 (Retina) – GPU
  FPS    25  21  20  25  13  13  16  0.5  16
iPhone 4 (Retina) – CPU
  FPS-L  32  23  20  10  21   6   8   6    7
  FPS-M  32  23  20   5  20   3   8   6    3
  FPS-H  32  23  19   4  20   2   8   6    2
  FPS-B  32  23  19   4  20   2   8   6    2

Notes about the Benchmark:

  • In general, the CPU mode seems pretty consistent with the way you’d expect things to work on the desktop – the same optimizations you’d apply for the browser plugin, you’d also apply to mobile for CPU mode.
  • Rotation in this benchmark is not continuous – the Bunny graphics are only rotated at the edge of the stage, which is why cacheAsBitmap works to speed those up. If they were constantly updated, it would likely be much more expensive on CPU mode (probably more like rotation without CaB).
  • Alpha is continuous – the alpha value of each Bunny is based on the y position and is updated every frame. I would like to add a mode similar to the rotation, to see what effect CaB has on alpha-transparent objects that don’t constantly change.
  • iPhone 4 and 3GS numbers aren’t directly comparable for practical purposes. The Bitmaps on the screen on 3GS take up much more real estate, since the 3GS screen res has 1/4 as many pixels as the iPhone 4. In a normal app, we’d probably resize things to look comparable between the two devices. I’ll try to add a mode that makes this more comparable (because I suspect we’ll find that 3GS can keep up with iPhone 4 with similar looking content).
  • Touching the screen seems to cost about 4 fps across the board.
  • I think there may be an issue with returning to rotation = 0 costing some performance in GPU mode. Still have to test that.
  • I’m definitely getting some variance on default speeds – basically, before any settings are messed with on some runs I get the faster numbers (the baseline numbers in the tables above). Other times it runs at default settings a couple of FPS slower (on start, or after resetting the switches). With any of the settings, everything is consistent across multiple runs.

 

It’d be nice to have more benchmarks for more devices, but I only have the above devices available. This should run just fine on Android, Blackberry Playbook, and iPads. If anyone wants to contribute a set of benchmarks, hit the comments. Here is the source. One of these days I’ll make another post, and try to draw some conclusions, maybe wrap the bullet points into a narrative, and edit some of this, but the tables are there, and the source code, and that’s the important stuff.

In the midst of playing with this benchmark, I found (or was pointed at) some great resources. Here are some of them:

Here is the Benchmark to see it in action:

]]>
http://www.unfocus.com/2011/07/03/performance-benchmarks-with-air-2-7-for-ios/feed/ 3
Scripts n Styles update 2.0.1 http://www.unfocus.com/2011/06/24/scripts-n-styles-update-2-0-1/ http://www.unfocus.com/2011/06/24/scripts-n-styles-update-2-0-1/#respond Fri, 24 Jun 2011 20:24:37 +0000 http://www.unfocus.com/?p=634 Scripts n Styles is a tool to allow admins (and editors in single installs) to add scripts and styles without editing template files, or worrying about authors overwriting the code (code is stripped when an author updates since they don’t have permission to use unfiltered html).

Improvements in version 2.0.1:

The meta box has been improved to provide a tabbed interface for less clutter, and syntax highlighting and formatting are added using the open-source CodeMirror 2.1.

An option has been added to allow adding scripts to the head element, in addition to the traditional bottom-of-the-page spot.

An Options page (under Tools) has been added so you can add Scripts n Styles to the entire site, rather than just individual posts and pages.

Some minor code improvements:

  • Better selection of post_types.
  • Micro-optimization for storage of class names.
  • Defined a later priority for Scripts n Styles to print after other scripts and styles.
  • Better adherence to coding standards.
  • Began contextual help (notes on capabilities).
]]>
http://www.unfocus.com/2011/06/24/scripts-n-styles-update-2-0-1/feed/ 0
What is a “Native” App? http://www.unfocus.com/2011/03/25/what-is-a-native-app/ http://www.unfocus.com/2011/03/25/what-is-a-native-app/#comments Fri, 25 Mar 2011 17:39:14 +0000 http://www.unfocus.com/?p=605 Continue reading "What is a “Native” App?"]]> I was recently asked my opinion on what makes a “native” app, and this was my response:

It depends on how you split that hair.

I think it depends on what platform level (hardware, OS, etc.) the particular user of the word native thinks that word applies to. It seems many use the word to refer to the actual bytecode and whether it matches the hardware (the CPU) – but in those cases I often see the term native used with the CPU architecture in the description, such as “native ARM” or “not native x86.” iOS apps compiled with AIR 2.6, I’d say, are compiled to native ARM bytecode.

There are other ways to parse it though – for example, it was pointed out to me that AIR for iOS apps are compiled from ABC bytecode into ARM bytecode to avoid the JIT (and Apple’s restrictions on the use of JIT), but that code still uses the virtual machine – the garbage collector, sandbox and whatnot. This gets right up to the edge of my understanding of virtual machines. But if the use of a VM precludes an app from being called native, then could .NET apps be native on Windows, or Dalvik apps on Android? In the case of .NET, there is even a JIT (pretty sure on that one, but not entirely so).

Then there’s the issue of targeted API (and ABI) – if an app is compiled to run on Windows, but is running in a VM on Linux, it’s probably not native (even though its CPU architecture probably matches), but if it runs in WINE on Linux, is that native?

Speaking of the Linux crowd – they parse their platforms with even more granularity. Gnome apps running on KDE are not native to some people, simply because they use a different GUI toolkit, though something running in an interpreted language like Python is native if it uses the “native” GUI toolkit. Games are not subject to this line of reasoning – if a game runs on Linux in OpenGL (without WINE) then it’s native.

I even remember reading some opinions in various places that programs not written in C are not native to Linux, and programs not written in C++ are not native to Windows – despite those programs using all the same APIs and ABIs, and not running in a VM.

So what is my opinion? As it relates to my current favorite target platform, I wouldn’t call an AIR app native – especially since it requires a 3rd party runtime to be installed separately (like on Android or desktops), and doesn’t have access to the native GUI toolkits and widgets and other OS APIs. That’s not a hard-and-fast opinion though; my definition of native is pretty malleable, and likely to change over time (or over the course of writing this response). I think I’d have a hard time selling the idea that a Java or AIR app is native to a client – on Android mostly because of the separate runtime requirement, and on all platforms because of the lack of access to OS-level APIs. It would feel disingenuous to call an AIR app a native app.

AIR for iOS comes closest to being reasonably called a native app – it is compiled as a complete standalone package, runs pretty close to the metal (being compiled to ARM code), and most importantly doesn’t require a third party runtime to be installed separately. If AIR for iOS apps had access to the native (underlying OS platform) GUI toolkit and other APIs, I would be more comfortable calling them native, though I probably still wouldn’t.

Probably the best definition of “native” I could come up with (which you still won’t get anywhere close to universal agreement on) is an app that comes out of using the platform maker’s tools to develop apps for the platform – Xcode + Objective-C (and other supported languages) for OSX and iOS, Visual Studio for Windows and Windows Phone, the Android SDK for Android – and even using Adobe’s tools to make an AIR app makes it a “native” AIR app, where using HaXe may not count as native.

Generally though, as much as I could, I would try not to discuss whether or not an app development tool like AIR is native at all – especially since that term is so subjective. A project needs a particular problem solved, and if I can do that with AIR (on iOS that means it doesn’t require iOS GUI elements and conventions, or other features of iOS), then that’s what I’d recommend.

Update: AIR 3.0 closes this gap, and makes the “Native App” comparison easier, because of two features: 1. Captive runtime – no more separate runtime requirement means it’s a standalone app. 2. Native Extensions – now an app has access to all the native functionality of the underlying platform. I’m comfortable calling an AIR app a Native App with AIR 3.0.

]]>
http://www.unfocus.com/2011/03/25/what-is-a-native-app/feed/ 4
WordPress Admin Bar Theme Support http://www.unfocus.com/2011/02/07/wordpress-admin-bar-theme-support/ http://www.unfocus.com/2011/02/07/wordpress-admin-bar-theme-support/#respond Tue, 08 Feb 2011 00:37:01 +0000 http://www.unfocus.com/?p=568 Continue reading "WordPress Admin Bar Theme Support"]]> Here’s a quickie for WordPress Theme designers:

If your theme is getting unwanted scrollbars because of the new Admin Bar in WordPress 3.1, the core team included a way to handle it. Add Theme Support for it!

With a full-height layout, you’ll want to avoid adding a margin or padding to a height that is already at 100%, because you’ll get useless scrollbars, and no one wants that. Instead, find the first non-full-height element (usually #header or some such), and apply the margin there (either 28px for the height of the admin bar, or add 28px to the existing margin if the element already has one). In the code snippet below I assumed you’d create an element or assign the class ‘admin-bar-fix’ to an existing element.

In your theme’s function file, add the following and modify as you see fit: (best to leave out the closing php tag though)
[cc lang="php"]
<?php
add_action( 'after_setup_theme', 'custom_theme_setup' );
function custom_theme_setup() {
	add_theme_support( 'admin-bar', array( 'callback' => 'admin_bar_bump_callback' ) );
}
// Mirrors the approach described above: push the first non-full-height element
// (anything with the 'admin-bar-fix' class) down by the 28px height of the admin bar.
function admin_bar_bump_callback() { ?>
	<style type="text/css">.admin-bar-fix { margin-top: 28px; }</style>
<?php } ?>
[/cc]

This snippet is derived from the TwentyTen Theme’s function file. The callback’s original code can be found in the source. Original snippet also found commented in the source (props ocean90).

Basically, by declaring support for the new (as of 3.1) “Admin Bar,” you declare that you can handle how your theme’s content gets “bumped” (by default, it gets pushed down by 28px via a margin on the html element). Most of the time the default behavior is fine… but it’s not fine on themes that have a height declaration of 100% (even min-height) or that have external scripts that declare 100% height on the html/body (like Google Translate does).

WordPress’s admin needed a similar treatment but that got patched. The 28px margin is just a default to handle most normal cases. Your theme is your responsibility 🙂

Cheers! Hope this saves some time for someone!

Update: Admin Bar Shim!

If you don’t have a 100% height type of layout but are annoyed by improperly scrolling anchored links, try the following.

If you add a <div id="admin-bar-shim"> (for lack of a better name) in your theme, surrounding everything inside the body except the wp_footer call (where the admin bar gets echoed), you can add
[cc lang=”css”]
.admin-bar #admin-bar-shim {
position: fixed;
bottom: 0pt;
left: 0pt;
right: 0pt;
overflow: auto;
top: 28px;
}
[/cc]
to your style sheet, or use the method above adding the callback. This new method allows anchor links to scroll properly.

]]>
http://www.unfocus.com/2011/02/07/wordpress-admin-bar-theme-support/feed/ 0
unBrix Alpha in Android Marketplace!! http://www.unfocus.com/2010/10/22/unbrix-alpha-in-android-marketplace/ http://www.unfocus.com/2010/10/22/unbrix-alpha-in-android-marketplace/#respond Sat, 23 Oct 2010 04:31:15 +0000 http://www.unfocus.com/?p=544 Continue reading "unBrix Alpha in Android Marketplace!!"]]> My first Android app is in the marketplace! Built with Adobe AIR, unBrix Alpha is a quick take on the classic breakout-style game. This is more of a “lite” game at this point (hence the “Alpha” suffix), but it is already more complete than many of the other Arkanoid clones in the iOS App Store. There also seem to be some last-minute performance problems on the Android version. :-/ I guess that’s what I get for only testing on an iPhone for most of the development. I’ll have fixes for that soon. I’m pretty sure it’s related to the scaleMode I set in Flash – the problem is that if I set it to the faster mode – NO_SCALE – it’s way too small on most Android devices. I’ll probably need to add some manual sizing based on measurement. Of course none of this was needed on the iPhone version.
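
Roughly, that manual-sizing fallback would look something like this – just a sketch, not the shipping code; the 320×480 design size and the function name are placeholders:

[cc lang="actionscript3"]
import flash.display.Sprite;
import flash.display.Stage;
import flash.display.StageAlign;
import flash.display.StageScaleMode;

// Assumed original (iPhone 3GS-sized) layout; not the game's real constants.
const DESIGN_WIDTH:Number = 320;
const DESIGN_HEIGHT:Number = 480;

function applyManualScale(content:Sprite, stage:Stage):void {
	// Use the faster NO_SCALE mode, then measure the real screen and scale the content ourselves.
	stage.scaleMode = StageScaleMode.NO_SCALE;
	stage.align = StageAlign.TOP_LEFT;
	var scale:Number = Math.min(stage.stageWidth / DESIGN_WIDTH,
	                            stage.stageHeight / DESIGN_HEIGHT);
	content.scaleX = content.scaleY = scale;
	// Center the letterboxed content on screen.
	content.x = (stage.stageWidth - DESIGN_WIDTH * scale) / 2;
	content.y = (stage.stageHeight - DESIGN_HEIGHT * scale) / 2;
}
[/cc]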

Download unBrix Alpha and let me know what you think! I’d provide a link, but I don’t know how.

Update: I nerfed the framerate a bit to get it to run a little smoother. I think the problem will be solved better by setting NO_SCALE, but I’ll have to do that another time (probably when I get to the iPad port!). I also fixed the red line, and the icon too (I don’t know why that didn’t show up last time). There is a report of the paddle jumping to one side when some users remove their finger from the screen. I haven’t been able to reproduce it, but please let me know if this happens to you! Here is a link to unBrix Alpha on appbrain (it isn’t showing the update yet).

Update 2: I switched to CPU rendering, because it seems as though GPU rendering is just slower on Android devices than CPU rendering – at least in this kind of game. Anyway, this solved a lot of problems, including missing text and missing effects. I also had to set a fullScreenRect to match the original intended size of the game (iPhone 3GS size). Doing these two things cleaned up most of the performance issues and graphics glitches. I’ll work on getting the remainder of the basics in place, like proper shutdowns – so this doesn’t run in the background like it does now (didn’t have to worry about that for iOS!).
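
For reference, assuming “fullScreenRect” here refers to the Stage.fullScreenSourceRect property, that part is a one-liner (the 320×480 rectangle being the original iPhone 3GS-sized layout mentioned above):

[cc lang="actionscript3"]
import flash.geom.Rectangle;

// Pin the rendered source area to the original design size (assumed 320x480).
stage.fullScreenSourceRect = new Rectangle(0, 0, 320, 480);
[/cc]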

]]>
http://www.unfocus.com/2010/10/22/unbrix-alpha-in-android-marketplace/feed/ 0
Flash iPhone Game at Silky 60FPS on 3GS http://www.unfocus.com/2010/10/01/flash-iphone-game-at-silky-60fps-on-3gs/ http://www.unfocus.com/2010/10/01/flash-iphone-game-at-silky-60fps-on-3gs/#respond Fri, 01 Oct 2010 17:57:06 +0000 http://www.unfocus.com/?p=519 Continue reading "Flash iPhone Game at Silky 60FPS on 3GS"]]> Well, it’s only a tech demo at the moment. I’ve been playing with this Breakout like game for a while, trying to learn the ins and outs of Flash mobile development – particularly as it relates to performance. I now have the unBrix demo running at close to 60FPS (59.1) – smooth as silk.

This won’t run at 60FPS in the Android Flash Player plugin in the browser (or in Firefox on Mac!) – this post is about the iPhone build – but here’s the web version to look at anyway.

Here is a blurry video of the thing running as a native iPhone app on a 3GS (I smoothed out the choppy splash transition in a later build by setting the BG element with cacheAsBitmapMatrix):

The most important thing was to make sure GPU acceleration was working, and to learn what things will impact performance in that area.

It turns out, there are some important differences in how GPU-accelerated Flash works compared with the traditional software renderer. In the software Flash renderer, keeping your display list shallow and sparse (using addChild/removeChild a lot), or avoiding the display list completely (by writing to BitmapData, as the Flixel game engine does), is a key performance optimization. This is how the exploding bunny video demo is done, and why it’s so fast.
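
As a point of reference, the blitting style looks roughly like this – a simplified sketch (not Flixel’s or Iain’s actual code, and the art is a stand-in): one Bitmap on the stage, and everything drawn into its BitmapData with copyPixels each frame.

[cc lang="actionscript3"]
import flash.display.Bitmap;
import flash.display.BitmapData;
import flash.geom.Point;
import flash.geom.Rectangle;

// The only DisplayObject on the stage: one Bitmap wrapping a canvas BitmapData.
var canvas:BitmapData = new BitmapData(480, 320, false, 0xFF000000);
var screen:Bitmap = new Bitmap(canvas);
addChild(screen);

// Stand-in sprite art (a real game would use embedded bitmap assets).
var art:BitmapData = new BitmapData(26, 37, true, 0xFFFFFFFF);
var artRect:Rectangle = art.rect;

function render(positions:Vector.<Point>):void {
	canvas.lock();
	canvas.fillRect(canvas.rect, 0xFF000000); // clear the canvas
	for (var i:int = 0; i < positions.length; i++) {
		// Copy each sprite's pixels straight into the canvas - no display objects involved.
		canvas.copyPixels(art, artRect, positions[i], null, null, true);
	}
	canvas.unlock();
}
[/cc]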

My current theory is that with GPU-accelerated content (even on the desktop) the reverse is true. You want to avoid CPU/system RAM to GPU/video RAM updates as much as possible – which means avoiding BitmapData updates, which cause the player to upload a new texture to GPU VRAM with every change. Because I don’t have access to the internals of the Flash Player architecture, I can’t be sure, but I think the bottleneck comes from clogging up the lanes between the CPU and GPU, and from stressing all three areas of the rendering pipeline (CPU, GPU and the bus) as they juggle objects around in memory. The key observation this conclusion is based on is the large performance impact addChild and removeChild have on the framerate. So I relentlessly avoid that in my iPhone Flash development – I precache everything, and don’t mess with the display list. This is also one reason why filters (which operate on a BitmapData representation of the DisplayObject you apply them to) are not recommended for mobile content.
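
In practice that means something like the pattern below – an illustrative sketch, not the game’s actual code (the drawn rectangle stands in for whatever art the game uses): create and cache everything up front, then toggle visibility at runtime instead of touching the display list.

[cc lang="actionscript3"]
import flash.display.Sprite;
import flash.geom.Matrix;

var pool:Vector.<Sprite> = new Vector.<Sprite>();

function precache(container:Sprite, count:int):void {
	for (var i:int = 0; i < count; i++) {
		var piece:Sprite = new Sprite();
		piece.graphics.beginFill(0xFFFFFF);
		piece.graphics.drawRect(0, 0, 32, 16); // stand-in brick art
		piece.graphics.endFill();
		piece.cacheAsBitmap = true;
		piece.cacheAsBitmapMatrix = new Matrix(); // needed for GPU caching on mobile AIR
		piece.visible = false;     // parked, but already on the display list
		container.addChild(piece); // addChild happens once, up front
		pool.push(piece);
	}
}

// At runtime, show and hide instead of calling addChild/removeChild.
function show(index:int, x:Number, y:Number):void {
	var piece:Sprite = pool[index];
	piece.x = x;
	piece.y = y;
	piece.visible = true;
}

function hide(index:int):void {
	pool[index].visible = false;
}
[/cc]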

Anyway, hopefully I can turn this into a full app for iPhone in a reasonable timeframe. 🙂

]]>
http://www.unfocus.com/2010/10/01/flash-iphone-game-at-silky-60fps-on-3gs/feed/ 0
Scripts n Styles http://www.unfocus.com/2010/09/15/scripts-n-styles/ http://www.unfocus.com/2010/09/15/scripts-n-styles/#respond Wed, 15 Sep 2010 17:59:25 +0000 http://www.unfocus.com/?p=502 Continue reading "Scripts n Styles"]]> Introducing a new plugin for WordPress from unFocus Projects!

Ever need to add a CSS style or some code snippet to just one page or post in WordPress? We’ve released an admin tool to do just that.

On the post edit screen of the admin, Scripts n Styles adds a meta box where you can add JavaScript, CSS, or even class names to the body tag or the post content wrapper (as long as the theme supports the wp_head, wp_footer, body_class, and post_class functions – and almost all do).

The plugin is available on WordPress.org Extend (and therefore your plugin admin screen 🙂 ). You can also fork it on GitHub. It’s licensed GPLv2. (Current version 1.0.2.)

Enjoy!

]]>
http://www.unfocus.com/2010/09/15/scripts-n-styles/feed/ 0