Tuesday, November 15, 2011

The Entropy of Morality

I've had plenty of thoughts about "moral relativism", mostly that it's pseudo-intellectual garbage. I haven't bothered to post exactly why I think that, but suffice it to say that it talks about everything as if it's reducible to first-order logic. If we can't (I mean literally can't) fully formalize math, then why the bloody hell would we expect to be able to do it with ETHICS?

But I thought of another reason why I believe morality exists: because true morality is hard. I'm not talking about sparing someone's feelings when they ask you if they've gained weight, or refraining from committing a crime. These are the easy things; social and legal forces are far-reaching and keep us in line. No, I'm talking about the hard choices. Putting yourself in physical danger because it's the right thing to do, living ascetically to escape the corruption of material wealth, quitting your job over an ethical situation that you have little part in (like, say... selling advertisements for cigarettes or dangerous pharmaceuticals that you yourself would never use.) Acts of serious courage and sacrifice are things you don't see every day; I confess that I've never taken a risk or made a sacrifice great enough to stop questioning my own virtue.

What does this have to do with anything? Isn't that arbitrary? Well, no. Saying that something is hard is a way of saying that it means something. Cowardice comes a dime a dozen. If I make a choice between an act of courage and an act of cowardice, it's not like both are decisions with equal weight. One is easy and common, the other rare and difficult; so it must mean more—just like it's more meaningful for me to write these words on paper than to punch random unintelligible keys, or to solve a difficult math problem rather than make an unsubstantiated conjecture based on nothing.

In other words, truly moral actions will have greater information content. They prove something, even if that something may be "arbitrary", just like math; a subject in which we still value proofs (for good reason.) Just as a game means nothing if there are no rules (with the exception of Calvinball), an action means nothing without difficulty. Cowardice, greed and hatred, on the other hand, never prove anything. There's no test, no rules and no meaning.

Tuesday, November 8, 2011

Paradigms vs. Narratives

I think I have a simple definition of a narrative (and, for contrast, of a paradigm):

A paradigm is a sign system that is closed.

A narrative is a sign system that is open.

This also easily explains the difference between game design and interactive storytelling (to an extent; you can think of it as a spectrum: chess is at the "game" extreme, whereas something like SimCity or Final Fantasy is a lot closer to the middle; nothing yet sits at the "interactive storytelling" extreme.)


I'll elaborate more on this soon and also talk about how this is shaping my work.

For that matter, I've decided I'm also going to be a lot more open about my work. Scrooges go nowhere.

Friday, September 30, 2011

Command and Control

Why do we overeat and refuse to burn it off? Why do we procrastinate; or even burn out despite our passion for a subject? What about the state of the world? Why are politicians so disappointing; what should we trust in when we vote for them—rigid rationalism, protocol; or the feeling deep within our gut? What about the problems that we just think are never going to end—is it willpower, self-experimentation or something else?

I've benefited a lot in recent months by letting go of reliance on two sides of the same coin—declarative knowledge and conscious "willpower." Both are important and play a role in many things, but they are the exception rather than the rule. The way I came to this was through, yes, some changes to how I ate and exercised. I've never had a weight problem, but I have some other reasons, such as asthma, to look after what goes in my body.

Upon investigating these ideas, I saw that there was a general flaw in the arguments about obesity. Moralizing self-help gurus have taken the truism that people gain weight because they "consume more energy than they expend" and, through some discursive sleight of hand, convinced us everyday laymen that obese people are being weak-willed by overeating and not exercising. To quote one great thinker: what horseshit!

Have we learned nothing from the mistakes of Descartes or the piles of research on the body's effect on the brain? I could even wager that thousands of years of wisdom would back this; but I really don't know and could be wrong about that. We are deeply connected to our bodies; they are the source of emotions, which are by extension the source of our decision making. Don't believe me? How about the fact that even with tons of analytic knowledge ("facts"), severing the connection between our brains and our bodily dispositions disables our ability to effectively make decisions. But I digress.

The reason why this "calories-in/calories-out" argument doesn't work is that our behavior is affected by all the biochemical reactions that are going on in our metabolism. When we eat too much junk food too regularly, our metabolism breaks down and our ability to efficiently process nutrients falls apart; insulin surges through the body at unhealthy rates, we store perfectly good nutrients as body fat and our brain does not get the glucose it needs to stop yelling "I'm hungry!" The result is that we find ourselves crashing and needing more food. So perhaps we just need to burn it off? Unlikely; where would you get all the energy needed if you can't even feel satiated?

Of course, all this could be wrong too; remember that bit about declarative knowledge. But I am convinced that these systems work something like this—the world is just too connected and nonlinear. And for this reason I've started to wonder, in light of just how embodied our decisions are, whether we need a whole new way of thinking about how to do things. When we procrastinate, it's likely that our mind is telling us that it doesn't approve of the plan. But couldn't that be irrational? Well, yes; we're not always cut out for the modern world that we've set up for ourselves—but the wisdom of emotions seems to be seriously underestimated. Our emotions really may be right when we're procrastinating; an unrealistic plan really can slow us down after all.

And how surprising is this when you look at something bigger, like the markets? Sometimes we like to give credit to politicians for saving or ruining the economy; but how much say do they really have? What effect does the average tax policy or stimulus plan have? I'd argue that sometimes they work; sometimes they provide a big enough jolt to shift things into another equilibrium—but that's a big maybe; the fact is, we can't even seem to predict where the markets go.

What does that tell us? That yes, there are things we can do. We have some conscious control over our choices, and there are certainly times when we just need to stop making excuses and apply elbow grease. But the parallels become even more striking when we hear about the limits to our willpower: how it can be increased through certain behaviors and states of health—but those behaviors and states must be brought about in some way to begin with. We have a very fragile modicum of control over a system that is highly belligerent and, on top of that (though somewhat related), quite random. We need to understand and appreciate these systems for what they are; not only economics and nutrition, but also our habits, the social interactions of people and, of course, the very stochasticity of life.

To do that, we must let go of the command and control model but not think that this is some key to "hijacking" our behavior. We talk about tricks to fool ourselves, but it's hard to believe that we can really just "trick" ourselves into doing everything. Nothing is going to gently nudge me into 8 hours of nonstop work (if only...) No, not a chance. There's something more than that. We need to look at the equilibria of our lives; at the harm of deterministic thinking and the benefits of random events; at the anti-fragility behind growth and discovery.

It's hard to say for sure what it all means, but a lot of things have helped me toward this path: eating and exercising more stochastically; cutting out certain food groups but not worrying if I cheat once in a while; not pushing the serotonin buttons of e-mail and Facebook first thing in the morning (though I've been slipping on that one); finding peace and discovery in fiction, walking, meditation and unscheduled activities; not adhering to rules too tightly.

But I'd hardly say that I read fiction for "knowledge" or take walks for "exercise"—more accurately, I exercise to learn and read for the sake of my body. Exercise has a specific scent of discovery to it; it resides at the core of so many things in our hunter-gatherer past. Reading makes me feel more whole; my body relaxes into a new state—just as well that I run anaerobically and weight-lift aerobically.

Maybe you agree, maybe you disagree. If you agree, how have you applied this to your life? If you disagree, you can have a say too ;)

Monday, September 26, 2011

Truth

Truth (capital "T" intended) exists.

But there are no such things as platonic forms. Only forms that we imagine.

We create forms and change them when we find that they cannot adequately describe the essence that we're attempting to capture.

It's in that moment of change, that flux between isomorphisms, that we see Truth. It's an aether that can only be seen via flux.

And that may be all that reality is; flux.

Tuesday, September 13, 2011

Meta-Entrepreneurship

I'm honestly hoping to get this blog back to being a regular thing. The past few weeks were pretty hectic, but now that we're finally getting through a long slog, I think I'll be able to start posting more.

Fear of Software should have a new demo up soon, btw, so keep a lookout for that.


There are a lot of things that I'd like to back-post about, but one concept recently came to mind that is not the most original but may well be overlooked. I called it "Meta-Entrepreneurship", and it came from the thought that the source of all entrepreneurship is problems. Now everybody knows this; every single book asks "what is the customer pain?"

And yet, everyone says that they don't have an idea. On second inspection, this claim is absurd. You're pretty much saying "I don't know of any problems." But this is silly; we cite problems every day as an excuse for not getting something done. In fact, this goes doubly for entrepreneurs. We do so many things that we seem to end up dealing with astronomical amounts of inconvenience on a daily basis. Even a fraction of those inconveniences should be more than enough to give us new problems to solve and make some money from (or maybe change the world, if you're into that sort of thing.)

Now, of course, we all have our domains. I can't just go creating a new branch of my company that deals with the fact that I don't have a dishwasher in this house (or maybe I can); but there are plenty of inconveniences with some relation to our domain, and if we set our minds to it, we can not only make our own processes faster but create new routes to success and profitability. That is, there's so much we can do if only we'd get a little bit more Meta about it.

I think there's also one other thing to take from this, however; and that's that if you're an entrepreneur, you shouldn't be blaming anything on inconveniences. Yes, some of them really are just that; but the whole spirit of entrepreneurship is capitalizing on the fact that there's a problem to be solved. If problems are such a, well, problem, then how can you convince yourself, let alone other people, that you're ready to solve a major one? As entrepreneurs, we have to go beyond the cliches and actually learn to love problems in every shape and size; not just the ones that we've cited to support some vision that we want to manifest:

That which rules within, when it is according to nature, is so affected with respect to the events which happen, that it always easily adapts itself to that which is and is presented to it. For it requires no definite material, but it moves towards its purpose, under certain conditions however; and it makes a material for itself out of that which opposes it, as fire lays hold of what falls into it, by which a small light would have been extinguished: but when the fire is strong, it soon appropriates to itself the matter which is heaped on it, and consumes it, and rises higher by means of this very material. (Marcus Aurelius, Meditations)

If we can do that, then every problem that we encounter, whether our own or someone else's, will be another fresh start.

Wednesday, August 17, 2011

Erudition and Innovation



My confession is that I'm writing this post in preparation for a lecture I'll be giving in a few days to the AIESEC conference. I figured what better way to help jog my mind than to present the general gist of what I'm thinking to my readers.

So without further ado, what will I be talking about in this post? Something that I think hasn't been paid close enough attention: the usefulness, even necessity, of broad erudition and its role in innovation. So let's begin.



My story is one that some readers of this blog probably know, but if there are any new readers out there, perhaps not. It was half a year ago that I found myself signing on as the co-founder of Fear of Software, and only a couple of months ago that I quit my day job as my initiation into becoming a full-time entrepreneur. In that time, a lot has happened as I gained erratic but valuable experience and learned from the wisdom of my co-founder Nick LaRacuente, who got me out of the misguided logic of thinking "I'll just build it and they will come" or expecting to have a 2-year development cycle without any kind of intermediate prototyping. And yet I can't say much about this sort of thing, because with this startup the story is really only in its opening chapter; the most shocking, difficult and serendipitous episodes have yet to arrive.

On the other hand, there's a whole lot that led to my standing right here (or in this case, sitting in front of this computer.) My story begins, well, probably from when I was 5 or 6, but zooming forward a bit... one could safely begin in high school. Back in middle school, I had taught myself how to program and started making video games. Most of the projects were over-ambitious, but they were always fun, and once in a while I completed something exceedingly simple. I remember specifically, however, in my first year of high school, coming across an editorial in Game Developer magazine (which I had a free subscription to) about the potential for games to be a legitimate art form on the level of cinema, books or even fine art. The idea hit me very hard; I was your average 14-year-old, mostly interested in video games, Coca-Cola and generally trying as best I could to look cool; but this idea for some reason was just like a shot to the head; I couldn't shake it!

Unfortunately, I didn't always have my stuff together in high school, so my projects generally got a running start and fizzled out just as fast; but something else was going on at the same time. I found myself for the first time extremely interested in literature; I started reading novels all of the time, even as it interfered with my schoolwork; for a while I even thought that I wanted to become a novelist and write fiction of my own. Film and music also became more interesting to me; before then, I pretty much just watched whatever Hollywood blockbusters looked sufficiently entertaining and listened to whatever was loud enough to ensure hearing loss. At one point I even found myself fortunate enough to collaborate with several artists in my high school on a graphic novel; a project that actually went surprisingly well--it was certainly childish, with all sorts of elves and demons reminiscent of Lord of the Rings with a twist of Saturday-morning anime, but interspersed with poems and philosophical digressions (don't get me wrong, it wasn't particularly good; this was high school!)

Life was actually pretty interesting in high school. Not much got done, at least not on the surface, but a lot was brewing. I still remember all of the afternoons I frittered away with friends hanging around nearby diners and walking the streets of Manhattan; but I digress, you don't want to hear that mushy stuff. My point is that something was going on in all that time; metamorphosis is never as straightforward as one thinks. Soon afterwards, I entered college; and luckily, a college where I'd have the same good fate of meeting friends that would help me develop further; there's another digression I need to take—good friends are important, they'll be the ones who enrich your experiences and lead you to your greatest ideas. This counts double for entrepreneurship, which is about people, your customers; not just the abstract ideas that one can go on about for pages in a novel or express in a piece of music.

Despite some initial confusion as to what I wanted to do with my time in college, I found myself a double major in English and Computer Science; English because I had always had an interest in literature and found myself enchanted with the study of books when I tentatively took a class on John Milton and fell in love with Paradise Lost—so dense with potential criticism that I read the entire thing at a rate of about 3 pages per hour, which admittedly left me a stressed-out mess, as I never had enough time to get my work done and barely passed my all-important math class.

But along the way I also found a few books on my own that opened my eyes to the process by which ideas are generated. In freshman year, I came across a book by the esoteric but highly renowned game designer Chris Crawford*, which rekindled my dying interest in game design. Later, in junior year, I came across The Black Swan by Nassim Taleb—which, if you read this blog (or know me in person), you know I never shut up about.

It's writers like these (a list of which would amount to some light name-dropping) that I hope to pay homage to in this post by using their ideas to explain the creative process and get across the simultaneously childishly erratic and gut-wrenchingly rigorous nature of innovation.

These writers, Crawford especially, left me with a new sense of purpose in my endeavors as well. I dreamed of a new kind of video game that would fit my changing self; the more serious self that found video games fun once in a while but couldn't feel engaged in them the same way as when he was a child. They were no longer as captivating; I had grown older and suspending my disbelief was not as easy as it used to be. I felt that this was what had turned most people off of gaming, and I wanted to bring the experience I had had with video games as a child to everyone. So I decided that I wanted to make video games that were serious intellectual challenges; games that engaged people's critical thinking, that created decisions people would inevitably look back on and wonder about the consequences of what they did, that made the player empathize with characters and be too engaged to move through the game's world as nothing more than a cold and calculating machine. I wanted to make an experience as universal as a movie or a book.

That put me on the path to today's startup, albeit in a very weird way. In junior year I applied for the honors program in computer science in order to work on a technology called Interactive Storytelling. This would be for creating games with stories whose course the player could change through their decisions; they would involve lifelike characters and difficult choices that would fit into complex narratives; there would be closure and catharsis, but also ambiguity. I was accepted, and soon enough I found myself overwhelmed. Luckily, a torrent of ideas came from a class in literary theory I had tentatively signed up for, which I'll talk more about soon.

To make things short, after two semesters of hard work, I cobbled together a tiny prototype and presented it to a general audience as well as the computer science department. The reviews were mixed, but I was happy nonetheless. After graduating college with Honors, I went on to work on it in my spare time before taking a hiatus and getting by with my day job as a paralegal. That is, until Nick convinced me to join forces with him to create a startup; and I couldn't say no.

Our two projects, when compared, shared a lot of theoretical qualities. But we decided to put our respective goals on hold when a new market opportunity came up. But I'm not here today to advertise our project, I wanted to talk about ideas!


So, what is "the stuff that dreams are made of?"
Well, not dreams; we all know that those come from living in your parents' basement wondering when you're going to have a car—no, I meant ideas. Where do we get those?

On the most fundamental level, ideas come from metaphors. That's right, every idea starts as a metaphor; a connection we see between two ideas because of their similarity. In your brain it looks a little bit like this two-part illustration I put up once before (you can praise my artistic talent in the comments section):

[Image: Your brain...]

[Image: ...getting an idea!]

So, those two dots that lit up: what are they? Let's come up with a couple of concepts... okay, right, this isn't the lecture; just a blog post. So let's say that the one on top is the business cycle in economics; and let's say the other one is... partying! Because who doesn't love a good party, especially during an economic boom? Well, these two ideas are separated because, well, they don't really have anything in common.

Or do they? Let's think about this one. What do we do at parties? We sometimes have a few too many drinks. Did I say drinks? If any of you are religious or just generally better behaved, let's just say extra-sugary fruit punch; the kind of stuff that will make you bounce off the walls before you get dizzy from all the running around and then crash. So, either way, you feel GREAT, and then you have a big stupid hangover; and we know that that's the price we pay for it, at least I hope we're all on the same page about that...

Now, what about an economic boom? Well, sometimes these booms are caused by a bubble. People get very enthusiastic about making a killing on the hot new asset, like say... big houses in the middle of suburbia. The price goes up 20%, then 50%, then suddenly they're three times as expensive as they were two years ago. It can't just continue like this, though; how many people are really going to pay a million dollars to move into a new house? Are you going to do that?

So what happens? The bubble bursts! And mind you, for those who are interested in the history of finance, this once happened in Holland over the price of tulips until... guess what? Somebody realized you can just grow them yourself. So suddenly, prices have to return to a level that actually makes sense; but with everyone having spent so much money on houses, others expecting to make a lot of money selling houses, and all the people who were hired to build houses because they were so profitable, guess what happens to all of them? Not very pleasant to think about: a lot of jobs, money and opportunities lost. The economy goes through a recession, readjusting to the sudden change.

But wait, there's a connection there. Something caused things to go up, to become super-active, and now that it's gone, everything comes crashing back down. And that's when the idea strikes: we realize that we need lows in order to correct the highs. So there you have it: now you know that binge drinking is a lot like a bad asset bubble.

These are the metaphors that help us advance in technology and discover new customer needs. Don't believe me? SimCity, one of the most successful video games of all time, was inspired by the reproduction of cells.** The beautiful prints of M.C. Escher are derived from mathematical concepts of recursion, topology and infinity (among many others.) More recent AI methodologies such as neural networks are inspired by biological systems rather than straight-up computation. Even our own idea of what goes on in the world is shaped by far-away stories; we look at politics through the lens of historical narratives and fictional fables. People in the United States are always asking "Are we Rome?" Sometimes overtly, but also implicitly whenever they ask about the future of their own country.
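Since the footnote points at Conway's Game of Life, here's a minimal sketch of it in Python (my own illustration; only the rules themselves come from Conway): a handful of rules about cell "reproduction" producing the kind of emergent behavior that reportedly fed into SimCity.

```python
from collections import Counter

def step(live_cells):
    """One generation of Conway's Game of Life; live_cells is a set of (x, y)."""
    # Count the live neighbors of every cell adjacent to a live cell.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live_cells
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell lives if it has exactly 3 live neighbors, or 2 and is already alive.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live_cells)}

# A "glider": a five-cell pattern that crawls across the grid forever.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    glider = step(glider)
print(sorted(glider))  # the same shape, shifted one cell diagonally
```

A couple of neighbor-counting rules, and suddenly gliders "walk"; it's not hard to see how a metaphor like that could grow into a city simulator.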

This happens even on a deeply neurological level. Our brains are ultimately pattern seeking devices and look to find patterns that match other patterns. We see it even in our everyday lives; have you ever seen a shadow and thought a dangerous person was waiting in a corner to attack you; only to find out that it was a garbage can? What about being startled at the sudden rustle of leaves in the middle of silence? This is pattern recognition at its most basic, and it finds its way even into our highest levels of thought.

But before dismissing it all as illogical and irrational, you should remember just how much smarter you, or anybody reading this post, are than a computer. That's right, we're so good at this game because we're not computers! Embrace it, be childish and let yourself find connections wherever they are. But how do we go about doing that? Well? Two things.

First, read. Read on every subject you can and don't worry about working on your specialty. You have your entire life to learn the newest Java API or the latest updates to the .NET framework (this part is for tech entrepreneurs; ignore it if you're just a reader who happened to get this far in my rant.) In fact, you'll find that the demands of most situations are so specific and strange that you won't be able to figure out what you need to know in advance; so don't sweat it (but do study your fundamentals; they come in handy.) Pick up some fiction and poetry; or if you're the more serious kind (like myself, for a while), bury your nose in some history, economics, psychology or anthropology. Be sure to learn some math too; big concepts that can help you understand things as systems. All in all, you should be looking to understand things in three categories: systems, ideas and the human condition. The third of those comes from fiction and poetry; stuff that really gets at what makes people tick, but that also requires doing another thing...

Live! There is no erudition without experience. Or put another way:

  "Experience, though no authority Were in this world, would be enough for me" -The Wife of Bath


A lot led to where I am now; much of it serendipity and a slowly accumulated pile of books. But what most prepared me for where I am now are the experiences I've had messing up, doing the wrong thing, doing all kinds of things that left me kicking myself in frustration. If it weren't for that, there would be nothing to hold all this together; just a bunch of gobbledygook.

Mistakes, there's an important word, and it reminds me of another question. What do we do with all these crazy ideas? Where is the empiricism? Ah, there's another idea I have to credit Chris Crawford with...


A T-Rex for ideas!

That's right, our ideas are just vulnerable little sheep, and we need a T-Rex to rip all but the toughest of them apart. That means attacking your ideas from every angle. Forget all this about being nice to yourself; be a contrarian, drive yourself crazy, be like that kid who keeps asking endless questions about why he needs to take a bath! Not only that: go out and talk to people about your ideas. This is how Nick and I figured out what to do with our own product; we went out and talked to people. And you won't always hear what you want to hear; but that's good, you're growing. Finally, go out there and just make some mistakes. Not too big, though; I don't want anyone coming back here with a missing limb!

As Kanye West once said: "that that don't kill me can only make me stronger!" So go out there and get stronger; I'm rooting for all of you. And thank you for reading.

---------------------------------------------------------
*He has a wonderful website for those interested: http://www.erasmatazz.com

**See John Conway's Game of Life.

Saturday, August 13, 2011

Consciousness (In 3.5 Steps)

I spent way too much of 2010 pondering metaphysics, so I gave it a break until recently, when I cracked open Gödel, Escher, Bach by Douglas Hofstadter and read the first chapter. That was enough to fill in the blanks of a fragmented theory of existence that I had been churning around for most of 2010. Here are the basics; I will elaborate further if anyone is interested:


1) Existence and Consciousness are inseparable

From an epistemological standpoint, I don't see any way these can be safely separated. Therefore, I'm assuming that they can't. There are other reasons, which are still a mess in my head, but I am more or less working from what I consider the indisputable truth of subjective experience.

2) Experience is the friction of incomplete information

Why do we never seem to remember locking our door? Why do we remember surprises more than routine things? Because we failed to predict them, and failed predictions carry new information. In information theory, this is called information content and is measured by how many bits it takes to store; if a probability distribution has a single outcome that is 100% likely to occur, that distribution has no information content.

Another way of thinking about this is that information content is also called entropy*, which in physics is the irreversible disorder that arises from motion; closely related to that other phenomenon we call friction.
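Here's a minimal sketch of that claim in Python (my own illustration; the formula is standard Shannon entropy): a certain outcome carries zero bits, and information grows as prediction gets harder.

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy: the average information content, in bits,
    of one outcome drawn from this distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy_bits([1.0]))         # 0.0   -- a sure thing; nothing to experience
print(entropy_bits([0.5, 0.5]))    # 1.0   -- a fair coin; maximal surprise
print(entropy_bits([0.95, 0.05]))  # ~0.29 -- mostly routine, occasionally surprising
```

The locked door is the first case: the outcome is so predictable that there is nothing to store, which is exactly why there's nothing to remember.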

Wolfgang Iser brilliantly applies this theory to the experience we have when reading books, and just about everything else in life seems to follow. When are video games boring? When there's no challenge left. When do movies suck? When they're way too predictable. But thankfully, things are always at least a little bit different than we anticipated.

3) You need identifiers to signify difference but you need difference to distinguish the identifiers from one another; this paradox creates a regress loop.

To put that in English, let's consider the colors of objects. If everything were the color red, we'd have no such thing as color. We notice things because there are differences. This ties back in to what I said about information content in the previous step; if there's a 100% probability of one particular outcome, then there is no information.

So if we want difference, we need some trait in which these things are different. But where does this trait come from? It would need to be distinguished from something else. Therefore, we're faced yet again with the same problem ad infinitum.

One last thing is needed then, to close this odd regress loop. Once we've done that, we essentially have a system in which we can continue to seek information in a cycle of anticipation and surprise but never reach the end:



While writing this, I realized that I was missing the answer to this part. I will try a basic sketch of this:

-There is some kind of self-reference that closes the loop of this problem with infinite levels of "difference"

-The result manifests itself as a paradox (or many of them--perhaps they all mean the same thing) such as Gödel's Incompleteness Theorem or the time dilation paradox.

-This creates a dog-chasing-its-tail effect in which a recursive pattern creates a fractal of uncountable codes--the friction of these being what we know as reality.

But feel free to take a stab at the last part and let me know what you think...



I should note that by the time I finish Gödel, Escher, Bach, it may turn out that Hofstadter beat me to the punch; in fact, he says that his thesis is that Meaningless Symbols Acquire Meaning Through Self-Reference. So far, though, it has proven to be simply the connective link that I had been missing between the three steps outlined above; I just hadn't put them together the right way, but they were all there a year ago.


------------------------------------------------------

*Forgive me if I've abused the poor term; I'm not a physicist.

Friday, August 5, 2011

Oops, just accidentally posted on the wrong blog; was writing technical notes for the project.

Will update here soon! Maybe some links?

Wednesday, August 3, 2011

A definition of narrative?

So many things I wanted to post about, so many work-related distractions and so many excuses. Here's a brief one.

My structural definition of a narrative:

A collection of events* that makes sense of itself without the use of axioms.


*I was hesitant about using the term "event", but I think that it's more precise than saying "signs"--there's a self-similarity here, I think; the idea of an event is itself inscribed in narrative.


Thoughts?

Thursday, June 23, 2011

Reading this amazing blog post from Evolutionary Psychiatry made me realize how classical and medieval theories of health and psychology show just how ridiculous our own modern theories of the mind and body must actually be. I think our overconfidence is best described in a quote by Searle about the brain:

...I cannot find the book I found it in. I will find it later, sorry. It may be in a book I left downstairs.

That Old Humanities Argument

We've all been asked this question before: why study the humanities? Why teach literary criticism? What is the point of learning about things that never happened and interpretations that can't be applied? Math and science have certainly been hyped up in the past few years by the presidential administration, amid looming economic fears of being uncompetitive in the face of rising powers and the political activism of people who want to see our money better spent.

As an English major in college, I had to struggle with this question myself; I was asked by other people and even had to deal with the prompt in a literary theory class (which I utterly failed at doing.) It is worth noting that a lot of my work in literary theory has been applicable to what I'm engineering now, but that's beside the point to me, because I know this wasn't all it was about. I certainly see literature as an important metaphysical experiment for philosophers, but I don't know how I feel about philosophers either.

But around a year ago I read about Stanley Fish's book Save the World On Your Own Time, which apparently* argued that trying to find some political or economic justification for the humanities denigrates them by denying the idea that they may just be good in and of themselves (does everything really come down to money and survival?)

While I was taking a break today, that line of thinking crossed over with all the time I've spent thinking about Edmund Burke and Nassim Taleb, and it dawned on me that I had been missing the obvious for years: perhaps the importance of literary criticism lies in the fact that despite having no explicit justification for it, we still continue to read, teach, analyze and deconstruct stories; value a supposedly "arbitrary" literary canon; ask questions about things that never happened; and give interpretations in the absence of right or wrong answers.

To me, asking why we value literature, spend so much time teaching it to students and even have tenured academics who spend their whole lives studying it is like asking why we have religion, or inauguration ceremonies, or hold doors open for other people. At this point, it's tradition and part of a deeper logic that we can't ever presume to understand--a point made tirelessly by Edmund Burke in the wake of a disastrous French Revolution based on simple top-down models. Simply put, I don't think that the world would be better off if we stopped holding doors for other people, and I don't think that we'd be better off not studying literature.

On a more concrete note, I think it is possible to glean the value of literary criticism and it is related to the importance of things beyond economic concerns. We live in a world richly populated by cultural phenomena. Being part of that world means understanding our cultural heritage and our shared idea of what it is to be human (cliche, I know, but isn't it true?) Would you refrain from teaching your kid table manners or how to talk to elders?

Whether it's through high school English, Hebrew School or wrestling in the grass with your classmates, we all have and all need rites of passage. Part of the humanities is spiritual training, the rest is something else.

----------------------------------------------------------------------------------------------------------

*No, I haven't read it, just making that clear. A second-hand summary did in fact raise some interesting points for me.

Monday, June 13, 2011

Childhood

Working on my own and trying to start a business has unsurprisingly been the source of a lot of anxiety. Sitting inside all day working starts to take a toll and even the weekends can be challenging if I don't get myself out and about. But as I was taking a walk today, I started to look around the city streets and enjoy them in the way that I have for so many years and this song popped into my head:


Most people would wonder why I remember such a song, let alone have the mp3 file for it; it's a bit nerdy to say the least. But for me, it has a very strong association with my childhood and evokes a refreshing sense of playfulness; everything becomes more atmospheric and a strange grandeur imbues itself on the symbols that populate the city skyline and the scenery of downtown Brooklyn.

It takes me back to the visceral pleasure I found in a game that most people passed over and considered one of Will Wright's more mediocre works. SimCity 3000 was the first time that I came to an understanding of what I was looking for in games: a sense of exploring a narrative, of decisions that impact people, and of the often contradictory and conflicting demands that they make both as individuals and as a whole.

The music of that game, especially this song, now brings me back to a sense of childlike playfulness; a feeling that to me is the ultimate remedy to anxiety--a sense that the world is yours to explore at your leisure. I never thought of childhood as idyllic, but it always did offer the opportunity to play; it placed a priority on learning, it gave space. In my quest to find the intersection of narrative and technology, to bring to the rest of the world the visceral inspiration that lies behind the screen of a personal computer, I've come full circle in having the opportunity once more to learn, to grow, to be a kid; being an entrepreneur might be a lot of work, but I once again feel the ability to explore the world and leave my mark on it.

Friday, June 3, 2011

As I begin work on Fear of Software's new project, which I will talk more about later, I find myself having to stay disciplined in order to get even the most creative and "inspiration"-oriented tasks done adequately. An excerpt from Chris Crawford comes to mind:

"In late 1981, Dr. Alan Kay recruited me into Atari Research and challenged me to dream. Most people take a lazy approach to dreaming. They put their feet up on the desk and engage in idle mental forays for half an hour, and they call it dreaming. To me, dreaming is a much more deliberate and difficult process. Dreaming is hard work!"

Thursday, May 5, 2011

The Confirmation Bias

Everyone knows what the confirmation bias is. Here's an especially good article on it:


Lehrer goes on to say many things that are well known: that we look for corroborating evidence, that we're not good at hearing or processing facts that go against what we think. He cites a scientist, however, who takes this a step further and puts the whole thing together: we don't reason in order to objectively understand the world but in order to persuade others.

Now, I'd say that it's also to persuade ourselves, but I'll get back to that in a moment. What struck me is the idea that knowledge is mostly ornamental; our abstract reasoning is subservient to our social reasoning. As Lehrer puts it so masterfully:

"Instead, the function of reasoning is rooted in communication, in the act of trying to persuade other people that what we believe is true. We are social animals all the way down."

This seems to me to be very deeply connected with the notion of cognitive dissonance and identity. For those who haven't read my previous posts, I was suggesting that we rationalize what we do, and look for evidence in order to do so, because it's what regulates our identity; acting in a totally "rational" manner would bring about symptoms resembling schizophrenia.* Think about a computer-generated actor that always makes the "optimal" choice; it would confuse the hell out of us as it changed behaviors without any warning or visible precedent. The body must fight off what is foreign, and so must the mind.

This is why I say that we also look for corroborating evidence to persuade ourselves. Really, though, I didn't bring this up just to one-up Mr. Lehrer; I wanted to say that persuading others and persuading ourselves are one and the same. If we find corroborating evidence in order to construct a story for others and do the same for ourselves, then it seems fair to say that narratives are the primary way in which we perceive the world. Our instinct, when we debate with others, is to try to tell a great story; no different than when we seduce or entertain, and also no different than when we make sense of our own situation by constructing an identity.

The story we tell ourselves and the stories we tell others are two sides of the same coin.


--------------------------------------------------------------------------------------------

*If anyone reading this deals with people who have serious problems such as Schizophrenia, I apologize in advance for anything presumptuous I may be saying. This blog is a loose network of working hypotheses, many of which will turn out to be extremely silly.

Sunday, May 1, 2011

The Triad

1) When I was in high school I was already thinking about semiotics, albeit without any knowledge of that word. One day, my biology class took a trip to an art activity held by a resident artist who had us all draw "three things that we were made of." A lot of people drew funny and goofy things; I think I took it too seriously. My three things were zeroes, ones and eyes. Zeroes and ones because those are the only two ingredients you need to create information. Why eyes? What is information without something to observe it?

This is still a haunting question in semantic and semiotic theory. That said, two might be the number of symbols (difference), but three is the number of signs. Another reason that I think three is the key number is that signs are not about one-to-one relationships between terms (that's a symbol!) Signs are about multiplicity and ambiguity; a sign has connotations, and it can be used not only to name but to give hints and even to lie. You need at least three nodes in a graph (in layman's terms, a network) to create something more than a simple one-to-one mapping.

2) Speaking of signs and triads, here's a quote that I found from The Name of the Rose that seems to implicitly talk about the triad of signifying/inferring/lying that I talked about earlier:

"It is of use to me as Venantius's prints in the snow were of use after he was dragged to the pigs' tub. The unicorn of the books is like a print. If the print exists, there must have existed something whose print it is. ...The idea is sign of things and the image is sign of the idea, sign of a sign. But from the image I reconstruct, if not hte body, the idea that others had of it." -William of Baskerville

Saturday, April 30, 2011

Minimal Interfaces/Information

I just installed Starcraft II on my new laptop, because it's great to actually have a seriously powerful computer these days; but it made me think about a mix-up I had with one particular unit (the Thor, for those interested.) I thought it was weak against aerial opponents because the listed numbers were particularly low, and the rate of fire wasn't very high either. It turns out that it's very powerful at taking out groups of aerial opponents, because its salvos are more powerful than the individual missiles suggest.

A small oversight that was corrected with experience, but this seems to be a problem in a lot of RTS games. The numbers are misleading and make it easy to misunderstand which units are good against which others. Starcraft II added a lot more information than the original Starcraft (thank God; when I was a kid I really misunderstood counters because I only looked at the numbers), but ultimately we're not suited to read these numbers very well when there are so many of them.

Command & Conquer was much different in that it only showed vague approximations; the player figured out the strengths and weaknesses of units via general descriptions and experience. For one thing, I always felt that made the game feel a bit more real; but my real point is that all that information misleads us and retards the process of learning how things actually work through experience.

I think that strategy games, in general, should do away with most of the explicit numerical information; it's more misleading than informative.

Tuesday, March 15, 2011

On Stories and Trying to Figure Them Out

From the end of my senior capstone paper in English on interactive storytelling:

The hodgepodge of theories, concepts and examples with which I've explained the problems of interactive storytelling has not been one that lends itself to being quickly wrapped up in a concluding page, let alone five. It would perhaps be more fitting to end on a story; to further interconnect the many concepts explained, and, if we're lucky, see a new idea emerge. After all, if there is one thing we academics can agree on, it's that he who of those delights can judge, and spare to interpose them oft, is not unwise.

Only a year ago, I had been recently accepted into the Honors program in computer science to begin a project whose name was simply “Interactive Storytelling.” My proposal, although over ten pages, contained little of what was needed to understand anything of the problem at hand; my understanding of narrative was confined to narrow formalist ideas of plot and a handful of scattered critical concepts.

By chance, I ended up in a class on literary theory after not making it into a different English class. Combined with the various books with which I supplemented my project, I found myself understanding narrative in ways that finally bore fruit. Combining my knowledge of mathematics and literary theory, I began to see stories themselves in a different way, abstracting them to structures juxtaposed in varying dimensions.

Nonetheless, as every structuralist knows, rules are meant to be broken and as I found myself writing this paper, a tension between the easily imagined existence of signs within systems and the inability to clearly explain their relationships to one another within some space kept me spending an entire day wondering what I was really saying, how I could find that balance between saying what needed to be said and leaving the rest open. I saw then what Stanley Fish talked about in another part of Interpreting the Variorum, a moment of tension in the process of reading.

That tension has seemed to exemplify the work I have done on interactive storytelling in the past year more than anything else; a constant tension between the axiomatic and the narrative. Mathematics is closed; there is always a clear path to understanding it. But only narratives endow us with any kind of meaning; even the great Paul Erdős, after proving an exceptional mathematical theorem, would always say "it's in the book!" I've been standing on the edge throughout the entire year; interpreting stories with technology, looking over as I make another carefully thought out conjecture.

Yet there has been a remarkable sense of fulfillment from it all. With each day, my own project and my own writing on the subject become increasingly robust and meaningful, even as I come to terms with this lack of understanding. Every day I am capturing narratives more, not less, even as they appear more elusive with each day. Perhaps more than anything I’m gaining satisfaction from this very lack of reconciliation. After all, this tension is the mark upon us which narrative leaves; narrative exists in the friction that comes from not quite understanding, from the vulnerability of any structure we create, no matter how simple. And if such tension is what defines a narrative, then this project, this critical technical practice, is one as well.

Tuesday, March 8, 2011

Lies and the Lying Semioticians That Tell Them

I want you all to meditate on what I'm currently meditating on:

A sign has three functions:
Signifying: ("dog" means a dog; and please, don't think too hard about that*)
Inference: (smoke is a sign that there is a fire happening nearby)
Lying

These are all related.


------------------------------------------------------------------
*The ambiguity of semantics is reflected in this entire entry

Friday, March 4, 2011

Note: It's always a good idea to take the precaution of saying "biological" instead of "natural" in order to throw smartasses off your trail.

Sunday, February 27, 2011

Men In Black

About a month ago I caught the second half of Men in Black on television; a superbly made movie, in hindsight. There were a lot of things I liked about it, but I think the most important was that unlike a lot of science-fiction themed movies, which support their plot with a well-articulated constructed world, Men in Black simply relies on a series of quirks: tabloid-style vignettes portraying the protagonists' (and our own) lack of understanding of everything happening.

Saturday, February 26, 2011

People talk to their friends and see psychiatrists in order to reconstruct their personal narratives—and yet when people do the same thing via religion, it suddenly becomes a debate about religion being "unscientific" or "misleading" or some other nonsense. Just saying.


EDIT: I just realized that the only way I can articulate my defense of religion is with these little vignettes. I have an idea of why, but I don't want to get into a long essay about it (talk to me—or leave a comment if you're interested in that sort of thing.)

Friday, February 25, 2011

The Garden of Platonic Forms

More on this later, perhaps, time permitting; but I may have stumbled upon the fundamental essence of post-structuralism:

Literature has no platonic forms, which invalidates the underlying assumption, held since Aristotle, that literature is a way to bring us closer to Platonic forms (Plato believed that poetry and drama were "thrice removed from nature," meaning an imitation of an imitation; but Aristotle believed in the power of poetry and drama to bypass the limitations of Earthly manifestations.)

Structuralism was the most rigorous* way of pursuing this, but it eventually saw its own limits; it thus became a study of this very cycle of interpreting and then finding new meaning by figuring out how the interpretation contradicted itself or simply didn't apply. Perhaps that's why I have as hard a time as the French do understanding any real difference between structuralism and post-structuralism.

--------------------------------------------

*Please take "rigorous with a grain of salt.

Sunday, February 13, 2011

Two Social Psychology Observations

1) Cognitive Dissonance: A long time ago I wrote that I thought cognitive dissonance was an evolutionary adaptation that made people in groups more predictable and trustworthy, and that it ultimately facilitated the use of narrative as a social heuristic. I still think this is true, but I also think its function is important on an individual level. To explain, I'll point to an observation quoted by the computer scientist and cultural theorist Phoebe Sengers:

In listening to Julie, it was often as though one were doing group psychotherapy with the one patient. Thus I was confronted with a babble or jumble of quite disparate attitudes, feelings, expressions of impulse. The patient's intonations, gestures, mannerisms, changed their character from moment to moment. ...It seemed therefore that one was in the presence of various fragments, or incomplete elements, of different 'personalities' in operation at the one time. Her 'word-salad' seemed to be the result of a number of quasi-autonomous partial systems striving to give expression to themselves out of the same mouth at the same time.
(Laing 1960, 195-196; quoted in Phoebe Sengers' "Schizophrenia and Narrative")

Sengers uses this example to make a point about how current methods in artificial intelligence (AI) operate: an optimal behavior is selected on the basis of some utility metric, without any regard for what was done in the past. This disregard for consistency is optimal for an agent within a game-theoretic model (classical economics, chess, etc.), but if a person were to operate this way, their identity would be completely inconsistent, every action taken completely out of context with every other one. In simpler terms, acting in this manner requires completely throwing out one's identity.
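To make the contrast concrete, here's a toy sketch in Python (entirely my own construction, not Sengers' model): two agents score the same noisy utilities, but one of them pays a small premium for acting out of character.

```python
import random

ACTIONS = ["comfort", "joke", "lecture", "ignore"]

def memoryless_agent(utility):
    """Picks the momentary optimum; no memory, so no continuity."""
    return max(ACTIONS, key=utility)

def consistent_agent(utility, history, inertia=0.5):
    """Same utilities, but familiar actions get a bonus proportional
    to how often they've been chosen before, so a habitual pattern
    (an identity, of sorts) accumulates."""
    def score(action):
        past_share = history.count(action) / len(history) if history else 0.0
        return utility(action) + inertia * past_share
    choice = max(ACTIONS, key=score)
    history.append(choice)
    return choice

history = []
for _ in range(10):
    utilities = {a: random.random() for a in ACTIONS}
    print(memoryless_agent(utilities.get), consistent_agent(utilities.get, history))
```

Run it and the left column jumps around with the noise while the right one settles into a recognizable habit. Note that the memoryless agent is still the textbook-optimal one; it just has no character.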

To a theoretical economist, this might be great news; after all, it was once said that a foolish consistency is the hobgoblin of little minds; but we shouldn't dismiss the importance of a consistent identity so fast. Reading through Antonio Damasio's Self Comes to Mind, I took note of his explanation of consciousness (synonymous with subjectivity, and in this blogger's humble opinion, narrative) as a more evolutionarily advanced mechanism for self-regulation, the process by which all life forms, from prokaryotes to mammals, fight off external chaos and maintain internal order. Self-regulation is the essence of all life, and consciousness is the most advanced tool for doing so by allowing us to make complex plans and coordinate seamlessly with large groups; but more than that, consciousness regulates itself in the same manner that life forms do—it is a consistent narrative that maintains itself and in doing so endows us with identity. Without a cohesive narrative, we cannot have a subjective identity; and without a subjective identity, our consciousness, that most crucial of means of human survival, quickly disintegrates. If you don't believe me, read the passage above one more time.

Cognitive dissonance is a way for us to fight external entropy, no different than the regulation of our body temperatures or our immune system's constant attack on what's foreign to the body.


2) Protagonists: On a much shorter note, it occurred to me that the presence of protagonists in stories is possibly a symptom of (or an appeal to) our own universally shared narcissism.

Thursday, February 10, 2011

If On A Winter's Night...

It was cold as balls tonight as I was taking a walk in only a hoodie and some gloves after a 20 minute workout. A lot of doubts were running through my head and, of course, passing by the scenery of Brooklyn Heights and Dumbo I had the fleeting fantasy of being a twenty-something year old millionaire with a condo perfectly overlooking the East River.

I can't survive on such self-indulgent fantasies; and if you have to wait for that moment in your life to find meaning, then million dollars or not, you're stuffed. That tied in with all my Taleb-inspired thoughts about randomness and how to live with it, and it reminded me of part of why I came to love studying narratives:

Narratives are the only things that truly belong to us. We can't (for the most part) control what happens to us, but we can choose how to make sense of it. We can't control the material consequences of our situation, but we can choose how they shape our intentionality. I feel as if, more than anything, this is what atheists misunderstand about religion: the primacy of subjective narratives over objective facts.*

Narrative lets us live for the moment as connected with the associations of the past and the projections and unknowns of the future.


-------------------------------------------------------------------

*It may be accurate to say that what religion deals with is eminently subjective; the multiplicity of narratives in our lives that can only be understood as subjective experiences.

Friday, January 28, 2011

Intuition and Mathematics: A Defense

Back in my days at Oberlin College, I had a memorably good math professor who I consider to be one of my mentors. Her grasp of the subject and the toughness of her problem sets helped me learn math that I hadn't thought myself capable of learning at the time. There was one point she made, however, that in hindsight I can't help but adamantly disagree with.

She said that intuition was what messed mathematicians up. Her points leading up to the conclusion were valid; mathematics is a language—you have a set of symbols and a set of rules for manipulating those symbols; and to prove something you follow those rules precisely, moving the symbols into the configuration that you'd like. Intuition can fail; but math never lies.

This fact about math has given way to some amazing tools, not least the ability for a computer to prove new theorems by arbitrarily moving around symbols until it finds something of note. The truth is, however, that as far as I know, these automated theorem provers have proven very little of value compared to the work of mathematicians; most applications of computers to theorem proving are programs custom-written to deal with the details of making a particular point, such as the proof of the Four Color Theorem. But I digress.

My professor was certainly right that intuition is flawed whereas following the rules doesn't fail; but what she didn't acknowledge is that our intuition can always be verified by checking the logic of our hunch: we can make a guess as to what the nature of a proof might be, and then write up the proof to see if it logically works. In fact, I think this process of guessing and verifying is essential to solving problems in mathematics.

This point is best made by way of a concept known as computational complexity (if you're not a technical person, please don't panic! It'll be clear in a minute.) Computational complexity is a measure of how difficult it is for a computer to solve a problem; different problems are organized into different classes based on how hard they are to solve. I'd like to focus on two, known as P and NP. P stands for polynomial: the problem can be solved by an algorithm in a polynomial number of steps. NP stands for nondeterministic polynomial: the problem can't necessarily be solved in a polynomial number of steps, but if you were to guess the answer, you could check whether your guess was right in a polynomial number of steps.

Now, to be fair, it might be the case that there is no problem in NP that is not in P (though it's highly suspected that there are such problems.) Assuming there are, there's a whole slew of interesting problems that can't be solved in a reasonable amount of time, but that could be verified quickly were somebody to make a "lucky guess."
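To see that asymmetry in code (a sketch of my own, not anything from the professor), take subset sum, a standard NP-complete problem: finding a subset of numbers that hits a target takes exponential time by brute force, but checking a proposed answer is a single pass.

```python
from itertools import chain, combinations

def solve_subset_sum(numbers, target):
    """Brute force over all 2^n subsets; this is the expensive direction."""
    subsets = chain.from_iterable(
        combinations(numbers, r) for r in range(len(numbers) + 1))
    for subset in subsets:
        if sum(subset) == target:
            return subset
    return None

def verify_subset_sum(numbers, target, guess):
    """Checking a 'lucky guess' is one linear pass; this is the cheap
    direction (a sketch: it ignores duplicate elements for simplicity)."""
    return sum(guess) == target and all(x in numbers for x in guess)

numbers = [3, 34, 4, 12, 5, 2]
print(solve_subset_sum(numbers, 9))           # (4, 5), found the slow way
print(verify_subset_sum(numbers, 9, (4, 5)))  # True, checked the fast way
```

The verify function is the mathematician's proof-check; the solve function is what you'd be stuck with if intuition never handed you a candidate.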

But how "lucky" is that guess? Our intuitions solve extremely complicated problems with a very high accuracy rate. They don't prove anything, but they make very good guesses with a very high chance of success in a very short amount of time. After all, our brains are some of the most complex computers imaginable, with complex networks made up of trillions of neurons and connections; we can recognize faces, navigate social situations with countless unwritten rules and outsmart a computer at just about anything that's not a game with transparent rules (and computers still can't play Go for shit.)

Our intuition allows us to stumble upon the answer to a problem with a very good chance of success. After that, we can use logic to see if we've done it right—as the P/NP distinction illustrates, verification isn't nearly as costly (in most cases) as solving.

[End of neat, pre-packaged entry; on to messy stuff]

(Warning: I am less sure about the following. Feel free to correct me on this; and as always, argue with me.) Another way of thinking about this is that with each level of computational complexity, the number of candidate solutions becomes increasingly vast and non-linear. Logic can only go so fast (at a linear speed, I suspect) and can't keep up with the increasing complexity of problems. Our intuition has two particular strengths:

1) It runs in parallel, which allows us to detect patterns very fast. Our neurons don't even fire all that fast; their effectiveness lies in the fact that we can distinguish information using space rather than time. No computer touches us in that regard.

2) It is built to approximate and work in probabilities rather than to think in absolutes, which is what my professor's argument encouraged mathematicians to do. Approximations don't always solve the problem, but in a lot of cases a small decrease in accuracy (say, a 10% chance you're wrong) can lead to finding an answer much more efficiently. So long as we can verify the answer in an efficient amount of time (that's what proofs are for), mathematicians can have their cake and eat it too; a sketch of this trade-off follows below.
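A classic instance of that trade-off (my example, not my professor's) is the Fermat primality test: each random trial is one fast modular exponentiation, and each passing trial shrinks the chance we're wrong.

```python
import random

def probably_prime(n, trials=20):
    """Fermat primality test: if n is prime, then a^(n-1) % n == 1
    for every a in [2, n-2]. Each trial is one fast modular power;
    each passing trial shrinks (but never quite kills) the chance
    that a composite slipped through -- rare Carmichael numbers can
    fool this test entirely, which is part of the bargain."""
    if n < 4:
        return n in (2, 3)
    for _ in range(trials):
        a = random.randrange(2, n - 1)
        if pow(a, n - 1, n) != 1:
            return False  # a witness: n is certainly composite
    return True  # prime, with high probability

print(probably_prime(2**61 - 1))  # True: a known Mersenne prime
print(probably_prime(2**61 + 1))  # False: divisible by 3
```

Twenty cheap trials and the odds of being fooled are vanishingly small: the lucky guess, checked.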