Wednesday, February 18, 2009

Dream on, dream on!

I don't usually like talking about dreams; people generally have their own interpretations, and I know all too well that most people groan when someone is about to tell them about a dream they had. But for the sake of understanding the science of it, which is important in the field I'm considering, I thought I'd break it down a little bit.

First of all, I never bought into Freud, who mostly took dreams to be stimulation of our primal unconscious. Instead, I was always fond of the Jungian approach, which saw dreams as a sort of window into the self: they were the ego's way of figuring itself out, an innately spiritual experience.

Jay Dixit, however, explains in more concrete detail that dreams are a sort of theater of threat rehearsal: they help us work through foreseeable practical problems, whether instinctual or drawn from experience. His article is quite interesting in that it takes a lot of things we already know about dreams and gives them a perspective that actually makes a lot of sense, while still implying that there's probably a lot more to the puzzle that we've yet to figure out.

As far as the possibility of using dreams as a therapeutic device goes, it seems completely rational to me that dreams could be a really effective tool for solving personal problems, traumatic or otherwise. If we could somehow harness the power of suggestion and apply it to dream-like scenarios, then people could solve these problems in their dreams by confronting them head-on. I've never heard of a dream where someone knowingly takes the wrong approach to solving a problem, so I'm leaning toward this thought: if therapy could become effective enough to introduce a potential dream scenario and suggest possible solutions, then when the actual dream occurs, the patient reinforces the solution in his own mind while gaining the benefits and confidence associated with that particular feat.

I'll look more into this as a possible experiment opportunity.

In other news, registration opened up for Seattle Central Community College. I almost signed up for the classes I wanted, only to find out that since I didn't attend Winter quarter, they had temporarily locked my registration status. I fixed it by calling them, but apparently I can't officially register until tomorrow. No biggie. Unfortunately, there aren't very many relevant courses for me to take this quarter except for math, so I'm just going to take a couple of semi-related classes for fun, and because I want to get used to taking a full load for the remainder of my school career. And I want to reap the benefits of financial aid. (Which is counter-productive, since financial aid on three classes doesn't outweigh the full cost of one class, but the other points are still valid.)

That's basically it. Sorry I haven't been posting too often; I'm going to try to make posts shorter and more frequent from here on.

Wednesday, February 11, 2009

The muse is a fickle bitch

So it seems our good friend Ray Kurzweil has recently received support and funding from both Google and NASA for his Singularity University, which has set off a certain signal in me that maybe his ideas will be publicly adopted a lot sooner than expected. Even granting his theories only a tiny bit of credence, 2030, his current deadline for total convergence, isn't too far away. But enough about that; read my prior blog post that explains the Singularity in pretty good detail.

What interested me about the article on the Singularity University was the question of how a generation could possibly prepare for such an event. Trends are definitely useful, but when those trends hit a point where we can no longer accurately predict any future event with any relevance, what's the point of preparing in the first place? The trends suggest that convergence will act in humanity's best interest, but we might not be the end-all be-all of technological progress, especially if we actually do end up designing self-replicating, superhumanly intelligent machines. There's simply no way to predict their motives and developments, in much the same way I'm sure mice have no idea what the hell we're up to.

But honestly, I'm not too worried about it.

Speaking of trends, it seems that new technologies are integrating themselves into society more and more by means of entertainment (as opposed to military and medicine). Later this year we're going to see an influx of mind-reading games, which I'm sure will raise an entirely new strain of general awareness concerning mental and psychological technology that's going to become exponentially more relevant in the coming decades.

Scientists are hoping that by 2019 we'll be able to genetically map every newborn, Gattaca-style, in order to forewarn of and prevent genetic defects. Beyond genetics, many new studies are focusing on reverse-engineering the brain in hopes of grasping the technicalities of how the mind works in far greater detail.

But back to entertainment... yes, entertainment. Entertainment is a fickle industry; what works one year doesn't always work the next. Innovation is one of those fleeting things that we can't really maintain a firm grasp on, but it's my hope that an integrated society will present itself with more opportunities for innovation by continuing its current inclusive trends. (See Wikipedia, Twitter, Facebook, etc.)

I've spent a lot of time trying to analyze what these trends mean. From the popularity of the Nintendo Wii to the rise of casual games, I'm seeing a market that's preparing itself for an explosion of some kind, but I can't quite foresee it yet. Games are only a part of the puzzle: Netflix Instant Watch, the iPhone's App Store, Steam. I don't see these ideas as simply entertainment going digital; I see trends heavily favoring an open-market, free-exchange, instant-access orgy of convergence and integration. It has already consumed the lives of so many, and that number isn't going to drop. Once production costs go down, once development tools become easier to use, once technology becomes so intuitive that the means by which we access it becomes invisible, we're going to have an entire planet working together, providing services and entertainment to each other to a degree that's simply impossible to keep up with, and the ante is going to be raised much higher. (Kind of like how it is now; who can honestly say they can keep up with the entire internet? We can in general terms, but not specifically.)

But the game industry in particular I find to be an anomaly. It's a relatively new industry, sure, but it hasn't come even remotely close to being treated as a novel and artistic medium comparable to other mediums such as film, literature, and music. Granted, it's the youngest, most expensive, and most difficult to produce for, which is why it's so important to help guide trends that solve some of these problems and work out the quirks of the industry. In my humblest of opinions, for lack of a better word, video games need to go indie. The stakes need to be lowered so that developers can put more resources into creativity and originality. Big game producers can stay where they're at; there's no problem with having a Hollywood for video games, but they can't be relied on to push the industry forward. There's simply too much money at stake to take massive risks, and a massive risk is exactly what the video game industry looks like to investors.

Thinking back on it, innovations in games are really few and far between, considering the limitless potential of what a video game really is. You need to give the player control over something, you need to introduce a 'game mechanic', and you need a reward system to push the player along. Beyond that, the means of game-play and the player's relationship to the game could be anything, absolutely anything, and the artistic implications have been largely ignored, save a few exceptions. Arcade games such as Tetris paved the way for many like-minded games up to Bejeweled and its many incarnations. Asteroids largely paved the way for top-down shooters, and while art direction has improved (see Ikaruga), the art direction and game-play have absolutely no correlation with each other beyond giving the player something pretty to look at while he shoots things. Doom becomes Crysis, Mario Bros becomes Metal Slug, Populous becomes Civilization IV, and, well, you get the idea.
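Just to make that skeleton concrete, here's a minimal sketch in Python of the three pieces I just named: control (player input), a mechanic (the rule that gives that input meaning), and a reward system (feedback that pushes the player along). It's a toy, not any real game's code, and all the names are mine.

```python
import random

def play(rounds=5):
    """A toy game reduced to its bare bones: control, mechanic, reward."""
    score = 0
    for round_number in range(1, rounds + 1):
        # Control: the player acts on the game.
        guess = input(f"Round {round_number}: guess a number from 1 to 3: ").strip()

        # Mechanic: the rule that decides what the player's action means.
        target = random.randint(1, 3)
        hit = guess.isdigit() and int(guess) == target

        # Reward: feedback that pushes the player along.
        if hit:
            score += 10
            print("Hit! +10 points")
        else:
            print(f"Miss (it was {target}).")
    print(f"Final score: {score}")

if __name__ == "__main__":
    play()
```

Every game I can think of is some elaboration of those three responsibilities; my complaint above is that the elaborations have been surprisingly conservative.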

The problem is that the experience doesn't match the game-play. The game-play becomes a monotonous and familiar experience with little twists built in that make it seem fresh and new again, but this is simply a mask. The challenge of the game is lost, at least as far as challenging the mind to truly allow itself to be immersed in a new and vulnerable experience. Modern games can't do this because the formulas are too thick and the experiences too familiar. The problem lies in developers not engaging their art form, and therefore the audience has no hope of engaging the product to a meaningful degree.

Of course, none of this surprises me. The video game market is far from broken, and as long as people buy into what is currently available, progress toward deeper innovation will remain slow. But what if, just hear me out here, what if there were an insurgence of artistic games, a re-examining of the very fabric of an entertainment experience, a movement, if you will, to revitalize the game industry into something deeply profound, something that got along much better with where the world is headed... what if? It's not a dream, really; it's the natural progression of things. It's just a matter of time, a question of when.

Sunday, February 8, 2009

The formation and inspiration of a hypothesis

So I'm sitting at my desk, drinking chai tea, browsing random tidbits through Google Reader, and I start to think about the brain, as I usually do, when a novel idea hits me: lateral thinking as a system to improve brain functionality!

Okay, so the idea isn't really all that novel, but it is a good enough idea to jump-start a line of personal research. Therefore, I'm going to incorporate the scientific method into this blog post by creating a hypothesis, and my ongoing research will be aimed at either verifying or falsifying it. Why do I care so much about proving this? I think it could potentially lead to profound discoveries in psychological therapy and related fields. Easier said than done, however; most of the research I've done so far on this matter has led me to more contradictions than proofs.

Take this article for example, a well-researched piece on generally accepted neuroscience about the different regions of the brain and how they operate. But it doesn't take neuroplasticity into account, the recent discovery that the brain is far more malleable and dynamic than previously thought, which has already led to amazing advancements for the sensory impaired. Basically, this research heavily suggests that the brain is highly adaptable, that this adaptability persists largely intact from childhood into the adult years, and that, theoretically, social conditioning is responsible for dumbing down our ability to retain information and think creatively.

There's a research project I've found myself frequenting recently called Reciprocality. It tries to define concepts that are paradoxically difficult to define, in particular what it refers to as a condition that affects almost everyone and orients them toward praising routine-based behavior. The research started by noticing that some software engineers fared exponentially better at their jobs than most others, with no clear explanation as to why. Because of the nature of the industry, employers have learned that if they want things done in a timely fashion, they need people trained to do very specific tasks. Projects that tread new territory or adapt to new technologies have no way of respecting any particular time-frame, since their obstacles are unforeseeable and take different levels of creativity to solve.

The research goes on to examine the differences between people who think one way as opposed to another: how one group is more adaptable at finding solutions and thinking outside the box, while the other group is more likely to remain stuck 'in the box'. It's profound to some degree: people who do one thing well can only do that one thing well, and have no frame of reference that will ever allow them to get outside the box they're trapped in, especially if they don't even know the nature of the box. The world we're turning into today is one where enough people can collaborate on projects that their varied areas of expertise allow hurdles to be crossed more efficiently, but I imagine we'll soon be heading into a world where everybody is prepared to learn a more general set of skills that will allow them to solve more complicated and abstract problems.

That said, I have formed a general hypothesis: abstract thinking may strengthen brain versatility and general problem-solving skills. The idea is based on the theory that linking different parts of the brain together in unusual ways allows for a greater level of idea association and the strengthening of neural pathways. More than proving this hypothesis true, though, I want to study the most effective way of strengthening the general adaptability of the brain, in hopes that the mind will become more predisposed to exposing itself to new ideas and ways of thinking, you know, to be perpetually progressive. It might be the Buddhist in me looking for a scientific way of achieving transcendence, but hey, I don't see how this line of research could hurt anyone, so let's have it, shall we?

I won't go into too much detail about how I plan to support this hypothesis, but I will give updates on my research and findings periodically, as well as share my experiment reports. This particular project will probably run for a long time, and I may decide to take on smaller scientific pursuits to fill in the gaps. I will say, however, that I plan on taking the principles of lateral thinking and applying them to more areas than just critical thinking. The idea is to find a system for linking different parts of the brain creatively and abstractly, to see if it causes significant changes in rational thought and problem-solving skills.

In slightly different news, I continually find myself really enjoying lectures on how the brain works, on the creative process, and on new and hopeful approaches to solving modern-day problems. Anyone who knows what I'm talking about has probably heard of TED, an annual conference where experts in a wide range of specialized fields are given a loudspeaker in order to exchange ideas with their colleagues. Appropriately subtitled "Ideas worth spreading", the lectures are offered to the general public in the hope of inspiring a progressively oriented future. One incredible example is Jill Bolte Taylor's recollection of her own stroke through the eyes of a neuroscientist.

One of the reasons I find talks like these so fascinating is that they give you a clearer view of how the brain operates in certain situations, offering a glimpse of how our own minds could operate if we just knew how to control them better and take away the walls that limit our day-to-day experiences. It reminds me that the brain isn't just used for solving algebraic equations; it's also what lets us witness beauty and profound experiences and trigger dramatic lifestyle changes for the better.

Friday, February 6, 2009

Drugs make smart!

So the other day, a good friend of mine and I had a heated discussion about nootropics (smart drugs) and the possibility of them invading the mass market. It's arguably a moot point, considering caffeine is already a widespread cognitive enhancer of which America alone apparently consumes 45 million pounds a year, but what is stopping scientists and researchers from developing far more effective and sophisticated smart drugs intended for the general market? Nothing, obviously. People are already quite aware of how easy it is to get hold of drugs like Adderall and Ritalin, as well as legal options such as piracetam and similar supplements. 'Cognitive enhancer' is such an arbitrary term anyway that there are really too many examples to list.
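As a rough sanity check on that 45-million-pound figure, here's the back-of-the-envelope math in Python. The US population of roughly 300 million is my own assumption for the period, not something from the discussion.

```python
# Back-of-the-envelope check on "45 million pounds of caffeine a year" in the US.
POUNDS_PER_YEAR = 45_000_000
GRAMS_PER_POUND = 453.6
US_POPULATION = 300_000_000  # assumed, roughly the late-2000s US population

grams_per_person_per_year = POUNDS_PER_YEAR * GRAMS_PER_POUND / US_POPULATION
mg_per_person_per_day = grams_per_person_per_year * 1000 / 365

print(f"~{grams_per_person_per_year:.0f} g per person per year")  # ~68 g
print(f"~{mg_per_person_per_day:.0f} mg per person per day")      # ~186 mg, on the order of two cups of coffee
```

That works out to roughly two cups of coffee per person per day, averaged over everyone including non-coffee-drinkers, which at least sounds plausible to me.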

But what I really want to know is what the problem truly is: why some people are more accepting of drugs like these while others would prefer to stay away. Some say it's an ethical choice, while others are quicker to point to the stigma that surrounds each individual substance. To me, however, the lines have become so blurred that I've just become downright fascinated by it. See, the brain is a complicated thing, and the way it reacts to different chemicals is such a diverse and complex process that I can tell quite intuitively that the science surrounding smart drugs has a long way to go, and I can say without a shadow of a doubt that further research into these drugs can be nothing but beneficial. The logic is simple: if people are going to take drugs to boost performance, we might as well make sure those drugs are effective and safe to use.

That said, I see the whole thing as somewhat of a trend, one that will probably come and go depending on the directions that mind-performance science takes in general. You have to wonder, though: we live in an age where people are becoming more accustomed to instant gratification and laziness while also becoming more obsessed with general self-improvement and Brain Age-style training, so these trends seem kind of obvious in hindsight. It's humorous in a way, if a little sad, but I remain optimistic; as long as this type of science is moving forward, things can only look up.

Wednesday, February 4, 2009

Trends in Technology

So apparently nanotechnology is going to take the next step in advancing Moore's Law by building transistors on a chip out of atom-sized 'quantum dots', semiconductors that shuttle information-carrying electrons around a circuit.

For those not in the know, Moore's Law says that the number of transistors on an integrated circuit doubles roughly every two years (the figure is often quoted as 18 months). While initially this was just an interesting observation, it has since become a self-fulfilling prophecy among chip manufacturers. And while it greatly increases technological potential, the law applies only to transistor counts, not CPU performance in general. The interesting thing about nanotechnology stepping in is that time and time again, critics of Moore's Law have been proven wrong about the physical limits on how dense a chip can actually get, and now that we're getting into molecule-sized conductors... I kind of wonder if we'll somehow pass that hurdle too. If we do, I'm never saying anything is impossible again.
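To put the doubling rule in concrete numbers, here's a quick sketch. The Intel 4004 baseline (about 2,300 transistors in 1971) is real; the exact two-year doubling period and the function name are just my own simplifications.

```python
# Moore's Law as described above: transistor counts roughly double every two years.
# Baseline: the Intel 4004 (1971), which had about 2,300 transistors.

def projected_transistors(year, base_year=1971, base_count=2_300, doubling_years=2.0):
    """Project a leading chip's transistor count under the doubling rule."""
    doublings = (year - base_year) / doubling_years
    return base_count * 2 ** doublings

for year in (1971, 1990, 2009):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
# 1971: ~2,300    1990: ~1.7 million    2009: ~1.2 billion
```

Those projections roughly line up with the real flagship chips of those eras, which is part of why the "self-fulfilling prophecy" framing feels right to me.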

These are interesting trends, though. Certain futurologists (not to be confused with the art movement of Futurism), most prominently Ray Kurzweil, have a theory called the Technological Singularity, which points to technology trends that persistently change exponentially. These accelerating changes in technology, energy manipulation, and artificial intelligence suggest that, unless the pace slows in the very near future, these exponential advancements will eventually reach a point where we can no longer identify or predict future developments. The idea is basically that we're going to create systems or artificial intelligences or self-replicating machines that are smarter and more complicated than we are, and they in turn will replicate into something even more complicated and advanced.

The theory is actually pretty sound. Even 10 years ago we could barely imagine the technology we have today. Computer integration and speed are improving exponentially, and phenomena such as social networking, web 2.0, and other information-integration services hint at something like a collective consciousness. Again, these are just extrapolated trends, but the internet is an interesting phenomenon. In a way, it's mostly a continuation of the printing press, just far more efficient, faster, and more capable. Even before that, civilizations found numerous creative ways to exchange information; homing pigeons come to mind. So why is this trend multiplying exponentially? Is it some sort of supernatural phenomenon, nature's way of developing a consciousness capable of joining itself into one super-consciousness?

Sure, it sounds silly, but imagine if you will a future where people can find any information they want, whenever they want. They have some sort of device on them at all times that is capable of finding any known and documented resource in the world; they can instantly communicate with people who are experts in a field, and can relay their findings and discoveries to others who find them relevant right away. You might even call a device like this a smartphone. Sure, connecting the world through technology is nice and all, but it's not necessarily a collective consciousness, is it? Maybe. The brain works primarily by linking relevant parts: there's a central nervous system, but the individual parts of the brain mostly work together in fairly limited ways. Already I feel that my ability to know anything about our universe is limited only by my own will to go and search for that information. Social networking will soon take bigger bounds, and collectively scientists, politicians, scholars, and the workforce of the world will work together to expand the technology we have today. Eventually people will be utilized more and more efficiently; hell, robots might even take over basic labor relatively soon, allowing more minds to join together in this epic cause. Obama's taking the first of many steps to ensure more people have access to faster internet connections and to help usher others into a digital age, and, and... y'know, I'm getting ahead of myself hardcore here, I'm gonna stop.

This is mostly optimism on my part, but that's what trend-reading does to me. Say all this is true: is it necessarily a better future? Not really, it could be worse in many ways, but it is fascinating on more levels than just technology, I think.

Sunday, February 1, 2009

Brain adaptation, it's a beautiful thing

I've never felt comfortable explaining how the brain works, as the brain is an extremely complicated thing that's virtually unknowable even to experts. Nevertheless, when I read research on new discoveries about the brain, I can't help but try to correlate the information with my own intuition of how my own brain works.

For example, an interesting excerpt about the function of new neurons in the hippocampus suggests that the reason neurons are generated in the adult brain (popular science usually suggests the brain is pretty much complete and static by the time it reaches adulthood) is that they give your brain contextual cues for new short-term memories. The neural circuitry that develops as a result helps give the individual a general timeline for their new memories.

The interesting thing about this, for me, is that the brain is already equipped to deal with extremely complex things. There's virtually no limit to what the brain can accomplish, but it is self-limited by the context of perception, which cues the brain to function according to its environment. The more experiences we accumulate, however, the more neural pathways are constructed between different parts of the brain, and the more emergent memory properties appear. For example, taxi drivers develop a relatively large hippocampus because they not only have to remember a large number of locations and city details, but also how to get from any one place to any other. The implication is that these neural pathways are much more structurally complicated, and it's easy for me to imagine how this is relevant for other areas of the brain.

However, I would suggest that maybe it's not only the hippocampus that is capable of growing within the adult brain. Recent research heavily suggests a nearly endless potential for plasticity in recovering patients' brains. (Plasticity refers to the brain's ability to take regions previously thought to serve a single function and repurpose them for completely different functions.) People who have lost their sense of balance, for example, can rework their brain's framework to use other senses to keep themselves balanced. This is complicated and difficult, of course, but with the help of therapists and machines it's quite possible. The implications have even spread to people with hearing loss, blindness, and other lost senses. So maybe, if we learn enough about how the brain works and how to maximize efficiency in its different areas, we might also be able to grow our entire brain through certain lifestyle choices.

However, the key word here is efficiency. Take this guy for example: since he was 14 he's been losing huge portions of his brain, leaving it about 50-75% smaller than the average brain, yet he lives a relatively average life with an IQ only slightly below average, and he is by no means mentally disabled. I'd imagine this person could even have attained a higher IQ despite his limited resources, but the fact that most brain scientists, prior to the discovery of this man's brain, would have said that such a reduction in brain matter makes basic survival impossible speaks volumes to me about how little we know about what the brain is capable of.

Anyway, enough about brains. I didn't make it clear in my first post, but I missed out on school this quarter. (I currently attend Seattle Central Community College.) My procrastination left me too late to register for any of the classes I really needed, so I took this quarter off altogether to help replenish my funds. (I have to pay for school all by myself, which I imagine is going to be hard on me when I transfer to a university.) Not a good excuse, I know; I wish I hadn't missed the deadline, and I probably could have done more to get into a class, but... February 17 is when I'll be able to register for spring quarter, and I'll make sure I register the first chance I get.