20121221

To dick or not to dick ?


Apple has never been known to be a nice company to be around, and is very much like its iconic co-founder in this regard : fanatically vision driven, frequently bright, proven right against the common wisdom of its detractors (and supporters) more often than its turn, but also certifiably sociopathic …and it's alright to be all those things, really — as long as it serves a greater purpose, and works.

In Apple's case, this manic drive has been a key factor not just in the company's survival (if barely, at times) but also in its changing and shaping the landscape of personal computing, and how we live with our gadgets on a day-to-day basis, whether you use Apple products or not.



When Jeebus was still a mere serial-baby-seal-clubber.

I believe this is no longer the case, and that Apple has jumped the shark from cruel to be kind to drunk on its own kool-aid.
To wit, pretty much everything newsworthy on the Apple front over the past two years has been about it displaying the stupendous amounts of incompetence and dickishness we're more accustomed to expecting from the likes of SONY or Microsoft, without much in the way of redeeming flashes of innovative marketing brilliance that are supposed to distract us from the Cult of Steve's abusive relationship with its followers.

[Full disclosure : I've been using, fixing, hacking and doing mundane and weird stuff with Macs for over 20 years, professionally at times. I'm also familiar with Microsoft platforms since the DOS era, and have run a few other OSes as daily workhorses, including BSD variants and BeOS. I'm currently sharing my computing time between MacOSX 10.6 (~65%), Android ICS/JB (~20%), WinXP (~5%) and Win7 (~10%). My browsers of choice are Chrome and Firefox (desktop), Opera Mobile and Dolphin (mobile). I don't really fancy long walks on the beach, but I'll happily meet you at the bar afterwards. Market share percentages and other figures in this article, unless explicitly linked, result from the conflation of various sources, smoothed with my thumb.]



His Jobiness, speaking the truth to his followers.

Of course, much of what can be held against Apple isn't exactly new : the Cupertino firm has always treated its devoted flock like crap, failing to acknowledge the debt it owes to its peculiar ecosystem of loyal customers, employees and third party developers. Only now, it's starting to look like the harem of battered wives it has built for itself may be thinking about seeing other people.

For all the self-entitled abuse it imposes upon its users and partners, Apple used to offer a few things no other company did : a comparatively reliable, trustworthy hardware and software environment, access to high-value niche products/customers, plus a strong, if screwed-up, brand culture, and enough gems of genuine vision to infuse the whole thing with its unique love-it-or-hate-it flavor. Oh, and also the Mac's cool GUI and (relatively) painless learning curve.
In short, embracing the Apple platform meant paying a steeper price upfront, compensated in uniqueness and consistency (and arguably lower cost of ownership in the mid-term).

It doesn't hurt that Microsoft used to do such a good job of RP'ing the ebil empire, either : going the Mac way was sexy and edgy on a cultural/political level, back when it meant doing the chic rebellious thing (as opposed to the then-loony rebellious thing of running a ghetto OS like Amiga or Linux).
Things have changed quite a bit since, however, and with Apple making a killing in both the desktop and mobile markets, it no longer gets to play the underdog card. AAPL isn't just valued sky-high, it's also the largest, most profitable single vendor of tablets, smartphones, multimedia handhelds and laptop computers today, which is hardly grounds for anyone to cry over MacOS X's still-marginal share of the PC market.

But that's not how Apple sees it, apparently : the with us or against us mentality is still in full force at Cupertino, exemplified by the campaign of bullying lawsuits they've been waging lately, and their generally anti-competitive, gun-crazy stance, none of which has proven successful enough to vindicate their views in the public eye. Add to that the Foxconn mess, and the drop in perceived value of the Apple platform, thanks to SONY-grade horror stories about Apple's declining Quality Assurance and customer service & support practices, and it's no mystery why aftermarket logo-covering shells for MacBooks are selling so well.

Meanwhile, Microsoft has been regaining some ground in public perception, with Windows 7 proving a usable product, and now Windows 8, for the first time ever, possibly putting MS ahead of the Apple curve in GUI evolution... just when Apple is having its own Vista moment. Many users are stubbornly sticking to MacOS 10.6 because the iOS-inspired 10.7-10.8 releases do nothing for them, earning the supposedly obsolete OS a reprieve.



Apple employees, hard at work to improve the customer experience.

The most glaring example of Apple's worldview drifting towards the pointlessly dickish is probably the 2012 launch of the Lightning power/bus connector : not only does it not add any significant value in terms of features and functionality over the preceding 30-pin connector, but it introduces an incompatibility with existing peripherals and accessories (both Apple's and third parties') that can only be (somewhat) solved by buying overpriced adapters from Apple*.

Much like the Lions-class OSes, Lightning fails to justify its existence to anyone but Apple itself, as it doesn't bring enough extra value relative to the restrictions it imposes on many users and developers to warrant the upgrade, and instead hinges on the certainty that people simply won't have a choice but to go the Apple way. The company is not half-assing it, either, going out of its way to stop users and third-party vendors from bridging the compatibility gaps and jumping over the garden walls inasmuch as it can.

This walled garden mentality, seemingly vindicated by the success of the iPod/Phone/Pad lines and the iOS Appstore, may turn out to be problematic, however, if Apple finds itself facing serious competition on the fronts where it traditionally dominates : ease of use, reliability, and brand prestige.
It's not so much that Apple has been losing its marketing edge, although the overhyped Cult of Steve thingie may be backfiring since the demise of Jobs**, and more that the likes of Google, Samsung or ASUS are making relentless efforts to best Apple at its own game. Android smartphone and tablet offerings cover a much wider range than Apple's, and no longer fall short in the high-end tier, often matching or exceeding Apple's products in features, looks, polish and ease of use, usually for cheaper.
Meanwhile, the prevalent attitude among Android vendors towards open standards, multiple appstores and interoperability with foreign environments is one of comparative openness : many Android-based hardware makers have opted to no longer interfere with rooting / jailbreaking, and some even go so far as to release source and documentation to facilitate better support of their devices by third-party devs and the hacker community.

At this writing, Apple keeps the upper hand on both the mobile hardware and OS fronts, and still leads in prestige and brand popularity. Largely because of the plethora of hardware offerings on the Android side of things, hardware vendors are reluctant to coordinate and invest in cultivating the Android brand and customer loyalty, which are likely to benefit their competitors just as much as their own product lines.
On the other hand, Google — as the standard-bearer of the Android forces — is doing a fairly good job of addressing the brand dilution issue by presenting a simple, easily grasped hardware lineup in its NEXUS series (lately of LG, Samsung and ASUS make) and is having a field day making Apple look like a sourpuss has-been, despite Apple commanding over half of the entire tablet and smartphone market all by itself.

On the desktop/laptop front, Apple's success is relative to where it's coming back from : at roughly 12% of the total PC OS market, vs about 85% for all Windows variants, it's not exactly in a dominant position, even though it's certainly one of the most profitable hardware vendors, thanks to the top-tier position and higher than average margins of its product line (prices range from $1,000 to $4,000). That leaves Apple way behind the HP-Dell duo in market share (each controls about 22% of the market), but it keeps growing while the others decline, thanks to a much more loyal customer base and to the gateway-drug effect of its tablets, smartphones and handhelds, ending up just as profitable as the leaders — if not more.


Apple products are for thinkers, of thoughts.

Apple's core edge lies in the branding and functional synergy between its PC, OS and mobile offers, which currently can't be matched by Android or Windows competitors. 
Microsoft's efforts in the mobile OS market are doomed to fail, as MS simply lacks the chops to compete with an established leader (MS's expertise and culture are all about destroying and co-opting smaller competitors, not besting heavyweights in fighting shape at their own game), and conversely, Android lacks a desktop OS counterpart for seamless integration, something neither Apple nor MS is eager to facilitate on their respective OSes.
That's something Apple is banking on, too, and the convergence between recent Mac OSX iterations and iOS, interface-wise, owes nothing to chance or laziness, and everything to Apple's dedication to persuading customers they should forever remain inside Apple's controlled ecosystem.

This could turn out to be a limiting factor, however, as the distinction between mobile platforms and serious hardware gets blurrier : the tablet market, which Apple singlehandedly brought to what it is today, is well on its way to eating the subnotebook's, then the laptop's, lunch.
With the desktop (as a consumer product) limping towards extinction, and mobile gizmos crossing some significant thresholds in performance (3D capability, full HD display, HDD-sized storage capacity), the platform war for the mobile-friendliest desktop OS may be over before it really begins, as the very notion of host OS may soon become obsolete, and the next generations of formerly mobile OSes become able to do everything we expect from a desktop computer, by hooking up to compatible peripherals.

And this is where the Apple model may hit a pothole, and its hostility to third party vendors and developers could backfire, from the first moment its share of the mobile market falls significantly below the 50% mark — and it will, inevitably.
Developers and hardware makers, if faced with the choice of supporting iOSX vs Android, with Apple at equal or lesser potential marketspace, may decide to focus on the platform that's less likely to leave their products dead in the water overnight, be it by revoking their access to the Appstore or changing the standard of a critical physical interface.

*

I'm certainly not predicting Apple's demise here, but a cascade failure of spectacular proportions may lie just around the corner for Cupertino if it doesn't wake up soon to the reality that we're no longer in the age of Bondi vs Beige, or iPod vs Zune : Apple can no longer count on the competition's terrible suckiness to make its products shine and look cool in comparison. Anecdotally, I hear from more and more people switching between iOS and Android in the wrong direction (for Apple), and not many going the other way around.

Apple is reaping considerable profits from its mobile/handheld platforms, and the PC business isn't bad, either, especially considering Apple's premium-priced, integrated model ensures a comparatively high profit per customer/sale. iOS customers are also spending more on average than their Android counterparts, both in the company's Appstore and on Apple and third-party accessories. Indeed, Android has a lot of ground to cover before it threatens Apple's lead in revenue, and no individual mobile device vendor can dream of touching Apple as of yet.
Thus, the future looks good for Apple — just not as good as it was before serious competition. Apple still owns the iOS market, but it's simply no longer all the market, and as the iOS expansion slows, it may get closer to saturation, especially if customer loyalty can't be taken for granted anymore.

As Apple has proven numerous times before, it can build great hardware, and software worth paying a premium to run. Now would be the time to get back to that approach, because there's no room left for an iPad Nano to follow the disappointing iPad Mini …it's called an iPhone and everybody who wanted one got one, already.

*

For largely different takes on the issue, I suggest this and this.

~

*[That is, until it got worked around by enterprising aftermarketers, about 5' after release.]
**[If everything good about Apple was of Steve's doing, how good is an Apple without Steve ?]

20121215

Agree to disagree ?


In democracy, politics is the art of misdirection.

Whether you believe in democracy as a virtuous and working system depends strictly on where you fall between your urge to be heard and your distrust of your fellow citizens' ability to make competent political choices.

If getting your voice heard is important enough to overshadow the risk of idiots contributing the majority of the vote, congratulations : you do believe in democracy.

If deep down you're terrified by the idiots having any sort of say in public matters, and realize your own opinions may not be that interesting, congratulations : you don't believe in democracy.
…yet you likely are even more afraid of something else, like communism or dictatorship, and democracy sounds like the lesser of many evils.

Here's a series of well-known facts about us all — they're truisms, but they'll come in handy in a minute:
  • Most people are idiots. 
  • Most people believe most people are idiots. 
  • Most idiots don't know they are. 
  • Most people don't believe they are idiots.


It follows that idiots (who think they aren't) distrust the mental and political competency of most other people (who they believe to be idiots), and yet feel of critical import that their personal opinions are heard and accounted for.



C'mon… you know you want to click on me.


The above may seem like an issue unfairly framed, and in some ways it is.
That's meant as an object lesson : because of the underlying assumption (shared by most) that a significant portion of the citizenry is in fact politically inept (by virtue of diverging from one's own views or interests), any debate in the context of a democracy tends to be framed in such a way as to either exclude, void or redirect the contributions of the people the speaker disagrees with, and favor the views she subscribes to.

I could write a book about the inherent self-contradictions of democracy as a mode of government (don't taunt me), but right now, here's a simple question for the presumed idiots out there, and also for you, my dear reader :

Can you imagine a working democratic government system that wouldn't hinge on misdirection ?


~

20121208

The Bourne Legacy - A Vicarious Postmortem


The other day I almost-reviewed — mostly spoiled and snarked at — the fourth movie in the "Bourne Whatever" franchise, and hinted there was more to come…
Well, here it is.

Today's post is about learning lessons from that wreck, and about the abilities and constraints applicable to spinoffs and expansions in a series.

Also it's a pic-less wall'o'text, because looking for iconography is too much of a timesink, and I'd rather get back to the next episode of The Fair Game series. Deal with it.

*

To start off on the right foot, let me state for the record that The Bourne Legacy, as it is, is not entirely a bad movie, and it only falls from as high, to fail only as hard, as it set itself up in the first place. What goes wrong mainly happens on the expectations management and internal consistency fronts, two topics dear to my heart.

As may not be obvious from my previous spoilerific review, the first 90-or-so minutes of the movie stand on firm legs, and really do a quite honorable job of upholding the franchise style and character — that is, until the flick suddenly takes the kind of unexpected turn in both style and substance that manages both to ruin everything that has come before and to make the — otherwise pretty good — pulp action of the last half-hour fall on the painful side of campy.

That sort of slip-up is especially inexcusable and damaging when applied to a franchise that's entirely about taking a fresh and clever view on otherwise tired genre clichés, and we'll get back to the question of why and how, but for now let's just say the end of the end of the movie was just about as welcome as a strawberry ice cream floater lobbed into a bowl of garlic clam chowder.

Let's have at it, in from-the-hip postmortem style.


What went right: 

In the absence of the titular character, the writers made the right choice by setting up The Bourne Legacy as a side-story rather than a sequel : this allows for tighter integration of the new story into the mainline storyline and instantly establishes the legitimacy of the expansion as part of the larger verse.
Similarly, most of the defining elements and themes of the early Jason Bourne movies are reused, but not so hamfistedly that it looks like a reboot by any other name. 

A particularly nice (if under-exploited) touch comes with the theme of the anti-captain america supersoldier, who owes not so much his physical abilities as his intellectual competency to being a military guinea pig, and fears a demotion to his previous state of mental retardation. It's a rather clever play on the classic allow me to show you my true worth by granting me superpowers, because the protagonist himself believes he isn't worth crap without his spook-special makeover.

Aaron Cross is slightly less of a wandering hero than Jason Bourne, but even though most of the plot is resolved over only three locales, mostly urban, we still get some sense of moving around quite a bit, and all the right checkboxes get crossed in terms of scenes that should be in a Jason Bourne movie : the roof chase, the weird/tense meet with a peer agent that ends with a dead peer, the house assault, the epic car chase in economy-class cars/bikes, a bit of improvised weaponry… the works.

The main cast is generally a success : Jeremy Renner works as a footsoldier upgraded way above his comfort zone, in a bit of clever metaplay on the actor's public perception as a second-tier star, and (nearly) gets a chance to grow up throughout the movie into a lead in his own right. Edward Norton is also a nice pick for the half burned out post-facto rationalizing baddie, and Rachel Weisz sells a surprisingly well-written fish-out-of-fishbowl scientist girl. 
In the extras/red-shirt department, there are a couple of nice touches too, with Zeljko Ivanek and Elizabeth Marvel, both of TV fame as morally ambiguous typecasts, who make the best of the small parts they're given. Some of the cast from previous installments reprise their roles without fault, which helps bring the whole thing together with the main universe and storyline.

The principal photography is not bad either, and generally fits alright with the atmosphere and codes established in the previous movies, if a bit different, but that's not a bad thing in itself.


What went wrong:

As subtly hinted in my previous review-cum-spoiler, the entire last quarter of the movie goes horribly wrong, not because it's poorly done (although at times it is), but because it runs off the rails of the entire franchise by moving into over-the-top pulp bombasticness for no apparent reason but laziness.
I have other gripes with the movie, but they also boil down to laziness : Stacy Keach is appropriately hamming it up as a generic cynical military-turned-conspiracy underboss — but sadly, it's entirely out of place in this specific setting. That kind of characterization is just phoning it in, writing-wise, and so are most of the other minor blunders in the movie, so let's focus on the core issues of lack of direction, consistency and proper storytelling.

Where the entire Bourne franchise hinges on stretching believability while keeping it hollywood-gritty and reasonably verisimilar, the Bourne Legacy drifts into pure pulp from the moment the main arc for the hero is resolved with a fetch quest for a magic cure for his illness/Achilles' heel. At this point, there is no story left in this beast, and the whole thing is running on fumes, so we're treated to a very intense and spectacular chase against a terminator-like baddie that goes on for the last 20 minutes of action in the movie, yet fails miserably because a) it lacks purpose, and b) it's stylistically off.

The point is… ?

From a simple storytelling perspective, the lack of purpose is obviously at odds with the supposedly climactic nature of the big finale chase and confrontation with the baddie, but let's see exactly how and why, because that's a common issue with some action flicks that endeavour to carry more of a story than a dungeon crawler.

The Bourne series is not about a protagonist kicking the arse of incrementally tougher henchmen until he gets to kill the bigbad. It's about a guy travelling around a world of trouble, in a quest for answers, and hopefully peace of mind — and breaking a few skulls if and when it's the most convenient solution to get baddies out of his way. The face-off with the dragon happens because the dragon is sent after him, not because the protagonist works his way up to the confrontation as a milestone.

Arguably that's the case in Legacy, too, as the determinator sent after Cross is not somebody Cross looks forward to meeting and fighting. The problem here is there's nothing at stake : the dragon is an obvious case of last-minute release monster X, and his obstruction accomplishes nothing but to slow down the hero's eventual escape from his pursuers, which we know must, by law of the genre, succeed — the only question here would be whether the girl is to get killed en route, but by that point we already trust she won't, for she's been sticking with him past the point of usefulness, and she's pretty much all he has to show in the way of spoils of war.

The big car chase is supposed to be a serious mountain to climb during the journey, ideally culminating in some flag capture or equivalent milestone in the storyline : in the original movie, it's the clincher that bonds Jason and Marie into a team, in the second, it's the pilgrimage en route to Bourne's confession and contrition to the daughter of one of his victims, in the third, it's the opportunity for Bourne to spare the life of a fellow operative at the end of a serious confrontation, who will repay him later by not shooting him when he could. In any case, there is more to come, and the chase happens as Bourne is intent on getting somewhere, not merely escaping — the latter of which we know he could manage at any time, because duh, he's Jason Bourne.

Yes, this makes your ass look huge.

That's not even the worst part about the final chase : style is the main offender here. Simply put, it doesn't even try to be believable and it's a mess to boot.

When your car/bike chase is so much of a collage of nonsensical show-off tricks that it would be hard to turn into a playable mission in GTA, that should be a solid hint something is wrong with it. Much worse is the fact it just doesn't fit the overall style of the series and manages to make a mockery of everything that came before (and wasn't an embarrassment) by throwing away the Bourne franchise's earnest efforts at keeping things mostly realistic (for blockbuster values thereof) and switching to an action-comedy gear we didn't know was supposed to be there.

There's nothing wrong with action comedy, or Hollywood-amped pulp/B-movie action : I've enjoyed some of Jason Statham's and Vin Diesel's action flicks without remorse or shame, and I can even go through any of Tom Cruise's ridiculous MI without gouging my eyes out with a spoon. But the moment you throw a trenchcoat on a ninja-zombie and pretend it fits into a John le Carré spy novel, you'll see me cringe.

But let's backtrack to the true moment of derailment, and what it indicates about the true wtf is it you think you're doing ? aspects of The Bourne Legacy.

The Bourne series is a fairly classic lone-spy setting, and revolves around a guy using his talents against his masters to both survive and unveil conspiracies after being marked for termination. Without going into the minutiae of the franchise, that's the bare essentials.
Where it differs from a pure action flick is in the rather involved story, with multiple reveals, twists and concurrent interests of major players/groups, and a protagonist that is expected to outsmart them all as he's peeling the onion of lies and coverups, all the while struggling with his not-entirely-reliable mind. No such thing here, even though everything was properly set up from the beginning.

There is only one conspiracy at play in Legacy, as you can safely ignore most of the mainline Bourne stuff happening in the background : bad guys are worried about being exposed and initiate a wholesale cleanup of all the people involved in the project the hero is a part of, flagging him and the scientist girl for termination. The mission, therefore, is to go into hiding forever after grabbing a MacGuffin to take care of the hero's addiction to pills.
Thus we're looking at a fairly straight line, with a single waypoint en route, laid on a flat surface, as opposed to the aforementioned conspirationist onion of the previous Bourne movies. The endgoal is equal to the initial starting condition, with a slight detour. That may be OK for a straight-up action movie, but a Bourne it does not make, legacy or otherwise.


What could have been done differently ?

We start with an interesting protagonist, who's more the grunt-next-door and less the elite-spook type than Jason Bourne, and who comes with a strong personal motivation : more than his life, which his type is expected to be prepared to risk at every turn, he's running against the clock to save his mind, which he knows only exists thanks to the chemical enhancements provided by his former masters turned enemies. Much like the original Bourne was driven by his compulsion to learn about himself, Cross is forced to get in the thick of things in order to not lose himself, and everything is primed for us to get a taste for the looming curse that's chasing him forward… except it will never happen.
We get a glimpse of what could possibly be Cross blanking as the meds start running out, shortly before he reaches the MacGuffin in Manila, but he gets injected with the magic cure for stupidness almost immediately, and we never see him in his diminished state (he shakes a cold-like fever overnight, during which he has bad dreams, wow).

That's too bad, considering we also have Dr Marta Shearing on call, who it is established is very competent in her field, and has proven to be quite smart and resilient generally. 
We'd have ample room here to work in a couple scenes right around or after grabbing the virus sample, where she'd get to carry the ball (and our hero) forward, while his abilities diminish to a rather terrifying point. We know of his potential value to her in his fully-able state, and that would be reason enough for her to put in the effort and risk to try and restore him, while it would go a long way toward explaining why he doesn't dump her the moment he is permanently fixed and she becomes dead weight (OK, we got hints of his good heart and possible crush on her earlier, but still). 

All of that could have been sorted just fine by the magic cure not being such a turnkey solution, and requiring a bit of time, work and tools.
That would have entailed a bit of respite for our heroes, which could have been brought about by their fortuitous "death" occurring after… a car chase !
There are ten different ways to make this work, thanks to the semi-random nature of the chase context : slow down or minimize the chasing party at will, for a minute or more, get the girl behind the wheel because the hero ate a bit of scenery or just went retrograde, etc.
This also would have given meaning and purpose to the mandatory car chase, and who knows, offered a bit of an opportunity to actually care about the determinator, too, provided he did not die right away — maybe he can't believe they're dead because of his built-in "amped mission fidelity", and thus keeps looking for them after they're declared dead, who knows…

*

This pace change alone would have solved two out of three problems, provided the chase was not so ludicrously designed (but that's a self-contained issue), and would have added a bit of depth to the whole thing, leaving only the conspiracy side of things a bit light for the price.
…which is convenient, because we might want something to wrap things up, even though reaching the point where the hero recovers his full cognitive abilities could be enough to provide a satisfying conclusion. Of course, that would leave the girl in charge of all the meaningful parts until the titles roll, which in itself could seriously screw with the alpha-male mythos here (although I personally would deem it a nice twist, and argue her doing the dragon in at the end of the bike chase was indeed a step in that direction).

Being an expansion and side-story to the Bourne mainline, Legacy is limited in what it can be allowed, conspiracy-wise : it couldn't afford to break the canonical storyline, nor kill any of the recurring bigbads, and generally had to avoid affecting or triggering verse-shattering events, for fear of closing plot branches and making things harder for the future of the franchise. Still, I'm confident Ed Norton's character is not above using child soldiers in African wars as a test group for new supersoldier juice or something equally disturbing, now that the whole Bournery thing is canonically commoditized in pill form.
How you get Aaron Cross in the loop, or to care about it, is another question, and one that would have been fitting for another installment in the Legacy subseries.

ttfn

~


20121205

The Bourne Legacy - as if…


Between sourcing my research by tripping on peyote while playtesting games old enough to get their own passport, then editing my ramblings before poasting*, not to mention all the stuff that happens off-stage, this editor sometimes needs a break, and last night I watched a movie…


(IMDB 6.8, RT 56%, MC 61%)



Fair warning — for reasons that will soon become apparent, I'm going to spoil the socks off this mother, so if you plan on watching this flick and are keen to preserve your sense of wonder, here's all you need to know — it will be good, while it lasts.

*

If you somehow managed to never see a Jason Bourne movie, here's what you need to know in addition: Jason Bourne is a super-spy, and a wanted man.


Previously on…

The Bourne Identity (Matt Damon is Jason Bourne)
In episode one, he starts as a bullethole-ridden amnesiac who soon discovers he has superhuman skills and more passports than fingers (counting the toes). He subsequently goes on a journey of self-discovery across Europe, taking along a cute bohemian hipsterette and riding in her Mini, all the while chased by people who don't want him to remember. It sounds like it would suck in all the usual bombastic ways, yet really is both good fun and surprisingly verisimilar and low-key (for Hollywood values thereof).

The Bourne Supremacy (They should have left him alone) 
In the second installment, our now-retired protagonist is framed by the baddies for the death of several CIA agents, sending the CIA after him. Just to cover every base, the baddies still send an assassin after him (as if), with the net result that the girl eats a bullet, thus sending the hero into a revenge frenzy. If it starts to sound like a Liam Neeson movie, it's because it kinda drifts towards that precipice, but manages to avoid the cliff jump by hanging on a solid thread of _noir_ spy intrigue and deviousness (some cool baddies, too).

The Bourne Ultimatum (This summer Jason Bourne comes home)
Part three is arguably where the series comes into its own : our by-now-soulcrushed hero is dead set on a course of revenge and redemption, with a side of death wish that gives some consistency to his going against all odds, _fear the one who has nothing to lose_-style. It pulls itself above the tarpit of tired cliché'dness through a fairly clever and earnest treatment and some good acting and direction.

…which brings us to The Bourne Legacy (There was never just one).
A hell of a misnomer of a title to boot, as it takes place concurrently with, and not after, the run of the original trilogy as you'd expect.**
The legacy part appears to be a case of nobody remembering to change the working title in time and getting stuck with it, or they just didn't care, who knows. What it really tells you is how the production approached the problem when confronted with the absence of both the titular character/actor and the director who fleshed him out.

Back to basics.

The Bourne Legacy is a Jason Bourne movie without Jason Bourne, and Tony Gilroy seemingly decided to go back to the original recipe, and do it all over again, with a slight change in flavoring oil, while keeping the option open to carry on with the original product line in the future.

What it came down to is this : refresh a tired trope with clever treatment.
In this case the tired trope-a-looza is made of : 
  • plentypotent field agent is chased by baddies from a government spook program gone feral, 
  • is not too sure he can trust his own mind, 
  • and takes a cute but-not-Disney-cute brunette along for the ride, 
  • because she's a witness and the baddies want her dead too, now. 
Also part of the deal, srs bzns car chases using economy-class vehicles, some parkour-ish rooftop acrobatics, improvised weapons, travel to exotic-yet-gritty locales, and a face-off with a dragon of similar background and skills to the hero's.

As pointed out above, pulling off the Bourne-sans-Bourne trick is all about tweaking the flavour and treatment, and the extra ingredients here are one dash of supersoldier sauce, a pinch of extra trope-metagaming with a lancer-to-hero tour de main, and one part moar peppa ! in the shape of an extra-crippling mental condition.

So, does it work ?
Yes but — quite literally — only up to a point.
See why after the jump.

20121203

The Fair Game, Part 4 - Mores, morals and morale.


When talking videogames, and upon approaching anything remotely related to morality, one must tread carefully — there be landmines everywhere, even in the quicksands.

In order to keep the odds of making it through in one piece ever so slightly in the positive range, here's a short list of what I will emphatically not discuss today :
  • Morality within videogames : or how morality issues are portrayed and tackled in videogames. 
  • Moral teachings of videogames : what are the implicit or explicit moral teachings (if any) that can be channeled through design and gameplay, and what should we make of that. 
  • Morality of videogames : somewhat connected to the two questions above is the (less interesting) one of whether videogames should(n't) touch on some topics/settings that may be morally objectionable to some, and why.*
Sticking to the Fair Game angle, my primary focus will instead be on current gamebiz mores, on the morality issues at hand in gamemaking as a trade and craft, on how they relate to the morale and morals of gamedevs and players alike, and on why they should be a defining aspect of a Fair Game studio's corporate policies, practice and culture.




There's no business like show business…

Few industries are fluffy bunny happy places, certainly ; in most people's minds suckiness is even sort of a defining element of what makes a job — or you wouldn't get paid for it.

There are exceptions of course : some people land jobs they'd be willing to do regardless of compensation as long as they can manage, because that's the very thing they love to do (think artists, athletes, many other trades that result from a hobby or spawn one), and sometimes, simply landing the job is its own reward and the chance to achieve otherwise unrealistic ambitions (think astronauts, Formula 1 drivers, key players in large goal-oriented teams).

Although these may be the exception rather than the norm (why it is so reaches beyond the scope of this article), the outlier cases of the jobs (some) people really want regardless of pay help us outline what makes a job worth doing, and why it even exists in the first place : desirability.
The odds of a job's existence are proportional to how badly everybody or somebody wants it to happen, relative to its resource requirements. Whatever else comes attached to a job, be it profitability and monetary rewards, or prestige and reputation, or power — all are simply mitigating or accelerating factors in balancing that equation.
In the beginning, it all starts with somebody who badly wants to do something, or somebody who badly wants it done — and sometimes both. 

Like most commercial arts, videogames are built on a foundation of people with a strong desire to do something, namely play and also make games, to a point where it's often impossible to disentangle both motivations at the individual level. Most composers and songwriters start as musicians, and eventually come around to write the music and lyrics they crave to play, much like most writers begin as avid readers, and moviemakers are movie buffs themselves more often than not.
Past the first generation in their art, creators typically grow up as fans (which can be a mixed blessing, artistically speaking, but that's another issue), and those who decide to make a career and a living from it do so out of love for the medium, and — especially in hybrid arts — count themselves lucky just to get there and be a part of it.

Were the entire game industry populated with avid gamers who make the kind of games they love for kin-spirited players, all would arguably be for the best… in fantasyland. 
Despite the bad rap I've given publishers in this series until now (and I'm not quite through yet), the fact we have an industry at all should largely go to the credit of early publishers who sometimes acted as enlightened patrons of the arts, true believers and enthusiasts, yet with enough of an eye on the bottom line to keep things rolling and snowballing. It's they who enabled gamemaking to graduate from garage industry to heavy iron, to take on projects that would have remained out of reach without the resources and scope enabled by broader market reach, and who led to an economy where videogames and electronic entertainment are now the largest driving force for innovation and development in hardware and software at large.

Granted, most among those key early publishers were often gamers themselves, and cared just as much about making great games for their own sake as about balancing the books, which isn't necessarily true anymore. Still, what they did back then is exactly what a Fair Game self-publishing studio should be about, accounting for two significant changes : games are now a multibillion-dollar, no-longer-cottage industry, and studios are no longer first-gen'ers on uncharted waters — as such, they should heed the lessons of history so as to prevent it from stammering too badly.



Breeding love slaves.

The past twenty years have been about games getting bigger, more expensive, and on the whole more profitable for all but gamedevs themselves. Although a handful of studio-founder gamedevs got seriously rich, the average salaries and benefits for skilled labor and talent have notoriously failed to get on par with other media and mass entertainment industries, while people in marketing, business and legal positions have reaped the largest monetary rewards.
Although things seem to have improved slightly of late, with people in development and production across the board seeing raises, and some rebalancing of salaries in favor of new recruits, the publisher-driven part of the industry (which is still the largest employment pool) remains primarily focused on making shareholders and executives rich, and regards concessions made to rewarding work, skill and talent in development and production as the regrettably inevitable costs of doing business.

Comparisons abound in the commercial arts and hobby industries to show how consumer markets built on activities people are willing to undertake out of passion rather than strict utility, profitability or convenience make for great exploitative opportunities for the less morally constrained entrepreneurs : workers practically beg to get shafted. 
There are more people with an actual paying job in the business of catering to the needs of aspiring professional musicians or actors (agents, publicists, coaches, publishers and music/AV prosumer gear) than people who make a living from acting or playing music alone. And that's not factoring in the larger entertainment industries' free or underpaid workforce of volunteers, interns, assistants or on-trial "juniors", who essentially pay for the privilege of doing the crappiest jobs less motivated temps wouldn't accept (for the price).

In effect, mass-market hobby industries have enabled the democratization of old practices previously only known to businesses that catered exclusively to the rich and bored, like sailing for fun or breeding race horses : a rich man's hobby, and a poor man's job.

While the games business never quite got there in terms of abusing aspiring gamemakers by selling them coaching and middlemen services**, it's proven quick on the uptake when it comes to making the most of its workforce's eagerness to join the party. Un-or-barely-paid internships abound, as do volunteer programs, and while lack of job security is accepted as par for the course in a boom'n'bust, project-centric business model (which is that of most studios), less-than-stellar workplace environments and sometimes downright abusive labor conditions are also frequently tolerated by employees who see them as the industry normal, perpetuated by many veterans who "got through it, too" and see it as paying one's dues.



Do unto others…

Behind the conservative business common sense argument — that working your employees to death for shit pay may not be the smartest course of action in an increasingly competitive environment where the loyalty of both customers and employees rises in value, and that slow and steady may prove to be the best policy for indie studios — there is an obvious moral one.

Fortunately, and for once, morals aren't necessarily at odds with business interests : not only is treating employees poorly morally dubious, it creates a vicious feedback loop where people who don't feel valued are less likely to value their own work, the product, the company, and ultimately the customer, whom — thanks to some cognitive bias I don't know the name for — they come to regard as both a sucker (for eating up the crap shoveled their way) and the enemy (because customers are indicted as the ultimate cause of gamedevs' predicament).
Obviously, bitter devs could just as easily hate on the suits instead, and they do, yet one can only focus on that for so long before figuring gamedevs are the sheeple being sheared, while blaming the customer here is enough of a non sequitur that it's less likely to be questioned : you can't argue with crazy. 

So here we are, with an industry that is as good at chewing up the naive and spitting out bitter, broken shells of burned-out talent as if it had been designed to do just that, from middle to top and bottom. Executives are comforted by their ever-increasing salaries and bonuses in their certainty that suckers are meant to be shafted, while gamedevs have long learned the lesson that loud beats imaginative every time and is easier to replicate. As for gamers, they're trained from infancy to love quantity (in explosions, polycount, play length, content and players) rather than quality (of challenge, gameplay, mechanics, story, people), with the result being their ever more fleeting commitment to a given title or brand, never-sated hunger for something more (which they can't even see really means something better), and ever growing frustration and impatience with the HFCS-infused gruel they can't stop themselves from gulping, even though it stopped being pleasurable sometime around last century.
  
That's how we got from thinking good games were a good enough reason to do it, to making bad games for no good reason, and how we segue from the issue of morals to that of morale.



Mutant Force assemble !

As hinted above, harnessing enthusiasm into churning doo-doo by the truckload under slave labor conditions works only until depletion of the existing reserves of enthusiasm, since there is little to feed back positively towards the slaves and replenish their giddy glands. So far, the industry has managed to make do and thrive nonetheless, terminating in droves those that start to shoot mostly bile and replacing them with freshly hatched noobs.
Nowhere is this more true than on the MMO side of the industry, where it applies equally well to players and gamedevs, and which is also where we can observe the comparatively long-term impact of this way of running a business.

[For the next few paragraphs, I'll speak with MMO gamedevs and players in mind, because of the remarkable similarity of their conditions and experiences with the business of making or playing games, both in terms of initial enthusiasm and commitment, disenfranchisement and disenchantment, burnout and eventual recovery or termination as gamers and gamemakers.]

As more and more people — players and gamedevs alike — individually go through a transformation from bright-eyed n00b to jaded bittervet at increasingly fast rates, they also develop immunities to entire genres and companies, looking for the traps before drooling for the bacon, and as they pass that sad wisdom on to their younger peers, and help each other get out***, they slowly inoculate a growing portion of the herd against the attraction of exploitative gaming practices.

What becomes of these gamers and gamedevs, seemingly lost to the industry ? 
Some certainly are burned out beyond return, and the mere idea of sitting in front of a computer game triggers PTSD-like flashes of angst and codependent stress in their crippled minds, ensuring they'll never go near anything remotely social in gaming (if not at large) again, and they'll keep their gaming to pointedly casual genres, if any.
Some, probably most, eventually come to the point where they make peace with the "not worth it" aspect of the whole thing, and while they keep a vested interest in the medium, they know full well why they don't want to return for seconds, at least until something changes radically to bring the offer closer to what they could deem tolerable.

Finally, a fraction of players and gamedevs have simply learned a different lesson, which is that the stuff they so badly want simply isn't to be found on the shelves of big-box retailers, brick'n'mortar or online, and that it's up to them to go out and either find it, or make it happen.
Because these people, as a group, self-select to be both experienced and discerning gamers and gamedevs, they're in a better position than most to recognize and value all the qualities big-name publishing is lacking, and when they find each other, to realize there are enough of them around to make it worth building games that cater to that otherwise forgone playerbase.

If anyone asks, that's where the "indie" craze comes from : the higher morale that comes from having morals.


As the industry matures, and the public for videogames has long reached beyond the proverbial pimpled teenage male to go after the juicy targets of… well, pretty much everyone, the question of moral obligation, or at least accountability, becomes harder and harder to dodge. People grow up with videogames like they grew up with TV a generation ago, and likewise are expected to keep playing through their adult life, while they raise their kids to be the third generation of videogamers.

As gamers and gamedevs both are increasingly aware, gaming has become a major part of our global culture, and the games we pick and play not only tell something about us, but contribute to inform our thoughts, sensibilities and outlooks, just like books, movies and TV do, only more efficiently, as they engage directly, and more and more frequently ask us to physically commit to the experience (think new game peripherals and ARG just for the most obvious), thus helping the messages they convey to sink faster and deeper into our minds.

It is only right that we've come to question the morality of the amoralist stance which used to be the party line among gamedevs, on the grounds that it's only games, and that we no longer leave that question to the bigots, Luddites and opportunists. Gamers don't turn into reactionary idiots just because they have kids ; they simply lose the privilege of evading serious questions the industry has for too long feared to confront, and which we — gamers and gamedevs alike — should be eager to explore, as it leads to a better understanding of our medium of choice.

Opening this post, I promised I wouldn't go near morality in games, and I'm steering dangerously close now, by touching on the moral responsibility that comes with handling powerful machinery around trusting people… That's as far as I go today, and only to mention how, as a group, the gamedevs and gamers that are going out of their way to find and make different games are increasingly self-aware and thoughtful in their practices.

Anyone who's being a cynic when going after that market had better be as smart as they think they are, or they're in for a surprise : unless they target objectivist useful idiots, they may be outsmarted by their marks before getting around to scamming a penny in funding from that crowd.

…which reminds me : the next installment will focus on Fair Game monies, where to find them, how to use them, and why you don't want to waste precious doubloons on booth babes, these days.

ttfn

~


* [Quick answer to that one, because it's easy : I believe everything is fair game, in principle. What is OK or not in practical terms is a matter of good or poor taste and falls under artistic discretion, to be weighed against how it can affect the business end of things — for better or worse.]

**[Although gamebiz-oriented schools and classes are sprouting left and right, many of those actually teach skills no less valuable than what you'd learn by getting a master's in fine arts on the way to becoming a game artist, so why not.]

***[Informal therapy groups of in-game/work friends who collectively help each other to move out and on from their shared grounds of sorrow are both heartwarming and rather depressing things to witness.]