Saturday, August 13, 2005

Nifty stuff

I just found out that Jef isn't among us anymore. But also that he used to be one helluva guy.

He has this enormous list of achievements that just goes on and on, and it really makes me feel like there should be MORE healthy, productive time at the disposal of some people. Anyways, I'll be checking out Archy, the nifty tool he was developing until the very end.

It kinda promises a lot more of what I really need: Zen calm and a problem-free environment to punch out ideas and be creative in. Less point and click, more straight-on keyboard-punching, and standardized rules for quick handling. Oh yes. And the notion that it should be IMPOSSIBLE for the computer to 'lose' some of your work. Ever. I just noticed that Blogger seems to have implemented a feature that promises NOT to lose work-in-progress, which is a HUGE improvement over regular web interfaces (see earlier post) >:(

Oh. And this must be the best concept ever: Theory of the week! It's just a little bit dead over there, but the information is still good. Brooks's law: "Adding manpower to a late software project makes it later". That's a notion I can second, and a very counter-intuitive, yet very logical phenomenon.

It also explains software projects' tendency to fail disastrously once they start to fail. Great fun!
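Here's a back-of-the-envelope sketch of my own (not from the theory-of-the-week page) of one mechanism behind the law: every person you add late has to be briefed by people who could otherwise be working, and the number of communication channels grows roughly quadratically with team size.

```python
# Rough illustration of one mechanism behind Brooks's law: pairwise
# communication channels in a team grow quadratically with team size,
# so every late addition piles coordination overhead on top of the
# time spent bringing the newcomer up to speed.
def channels(team_size: int) -> int:
    """Number of pairwise communication paths in a team."""
    return team_size * (team_size - 1) // 2

for size in (3, 5, 10, 20):
    print(size, "people ->", channels(size), "channels")
# 3 people -> 3 channels
# 5 people -> 10 channels
# 10 people -> 45 channels
# 20 people -> 190 channels
```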

Wednesday, July 27, 2005

Solid state, baby, solid state!

Oh joy, oh joy! One of the more interesting pieces of news I've seen from the storage industry lately (not counting yet-to-come miracle technologies) comes from [www.gigabyte.com]: the next step (it's really more like a giant leap) in digital storage.

Anandtech has done a little review of Gigabyte's i-RAM, and I remain absolutely thrilled. The i-RAM isn't in itself very revolutionary, but more like a sign of things to come.

The i-RAM is, in its current version, a PCI card with 4 RAM slots, free to fill with whatever RAM you want. The exact details aren't important; what is, is that once booted, the storage capacity of the RAM will be detected as an unformatted hard drive by the BIOS. You're free to install exactly what you want on it, and it will for all practical purposes behave exactly like a hard drive. Except it's blindingly fast. It's so fast it should max out your PCI bus, continuously, and handle multiple requests well, due to NO MOVING PARTS (larger than electron-size, anyway). And silent.

In short it will be just as RAM is: it just works, and usually finishes its tasks before you notice it has even started. It will be a lot slower than regular RAM, since the PCI bus is the limiting factor, and the units produced thus far have had non-optimized chip layouts.
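Some rough numbers of my own (assumed theoretical maxima, not figures from the Anandtech review) to show why the interface, not the RAM itself, sets the ceiling, whether the data path is capped by classic PCI or by a first-generation SATA link:

```python
# Back-of-the-envelope: the storage interface, not the RAM, is the bottleneck.
# All figures are rough theoretical maxima, used only for a sense of scale.
PCI_32_33_MBPS = 133   # classic 32-bit / 33 MHz PCI bus, ~133 MB/s
SATA1_MBPS = 150       # first-generation SATA (1.5 Gb/s), ~150 MB/s
DDR400_MBPS = 3200     # single-channel DDR400, ~3.2 GB/s

for name, mbps in [("PCI 32/33", PCI_32_33_MBPS), ("SATA 1.5Gb/s", SATA1_MBPS)]:
    print(f"{name}: ~{mbps} MB/s, roughly {DDR400_MBPS / mbps:.0f}x slower than DDR400")
```

Either way, the drive-side link is a couple of dozen times slower than what the RAM modules themselves could deliver, which is exactly why it still feels like "things to come" rather than the destination.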

The main feature is that the built-in battery will keep the i-RAM powered and retain data even if the unit is unplugged, for up to 16 hours (more than enough to be useful).

The current size/price ratio is horrible, and the maximum capacity is way too small for REAL fun, but consider this: a stationary PC with a fanless power supply, an extra large CPU heatsink and a fanless graphics card. Add in one i-RAM with 4 GB capacity for the OS and temporary caching of files for playback, and use a Gigabit NIC to fetch the files. What you'll be left with is a mediacenter PC that makes LESS noise than your refrigerator, or even the home stereo at max volume with nothing playing.

The only way to tell if it's on is to check the LEDs, or observe infrared heat emission. It's sooo sweet I can't even begin to explain all the nuances of GOOD this means. But if this gets some time to become more usable, the price/capacity ratios come down, and the performance of all 'silent' components is ok, then home mediacentre solutions with solid state clients and media content servers (sooner or later super-broadband providers' server parks can do this job) will become a really popular combo.

Saturday, May 28, 2005

Digital nostalgia, where will we be in 100 years?

As I just happen to have enjoyed a little less tech-influenced style of living, I haven't used my old fulltower PC in about a year. As I visited my parents this weekend I stumbled across an 'old' hard drive (it's not that it's antique or anything, it just hadn't been powered on for about a year, and had 111 GB of my stuff inside). I went through some of the contents (120 GB is a LOT of bits). Wham! Instant nostalgia! I hit pictures, music and software that were at least a year old, and more often than not much older than that. It brought back memories, and made me go through lots of things that happened back when I used the disk the most, and even 5-year-(plus)-old music that touched some emotional nerve.

It made me think that hard drives, and other digital storage media, are like the diaries and photo albums of the digital age. The role isn't exactly new, really; all digital activity has been a kind of memory box for whoever was working with particular types of software.

It also follows that this role is a new, and maybe even more demanding, challenge for the computer. "Family Critical Computing" is the new trendy word. This is a tough nut for home computers, since most of the basic reliability of such a system depends on the hard drive running error-free.

It is also interesting to think about what might be around in 100, or maybe 5, years from now.. *sighs*... good old summer music; what I really need is for it to "stay forever young".. I'll put my money on regular backup routines and open formats.

For future "Family Critical" purposes, I'll consider using some off-site storage where I can buy space at reasonable rates. For now Gmail will do.

Wednesday, February 02, 2005

Moore: exponential growth of Goodness!

Moore's law.

It's a concept that I like very much. It gets more and more.. interesting, the more I think about it. It's widely known, but gets much less hype than it really deserves.
Think about this: you might compare the effects of Mr. Newton's law of gravity with Moore's law. To push a slightly ambitious analogy, just consider one difference between them: the medium they affect. Gravity affects anything that has mass, and is a force that accelerates objects with mass toward each other.

The most notable effect of this law is that if you let go of your cup of coffee it will most likely, depending on your location, accelerate toward the gravitational center of the earth.

Moore's law is not a law of nature. It is more a close relationship between the fundamental production process of transistor-based technology, the development of these methods, and the economic laws of supply and demand, which have, in a remarkably consistent way, scaled down the size of transistors so that they shrink about 50% every 18 months. This is not a simple, predictable development that can go on forever (and it should really be named 'Moore's tendency' to be more correct).

The immediate effect is to continuously drive down the size, price, and raw material cost per transistor in microchips. This translates nicely into daily life, so that anything that can be done with a transistor-based chip WILL be done with a transistor. And every 18 months these chips WILL be smaller, more powerful, less hungry for electricity, and in general better suited for whatever purpose you can put them to. With the increasing computational and micromanaging power available on a global scale, this will be a force that accelerates the flow of information and speeds up the automation of machines, and virtually every other process that can be controlled or enhanced by microchips.

This could be perceived as 'nothing unusual', given that this development is an everyday phenomenon; just like gravity, it's nothing much to get worked up about, it's just there, right? The coffee cup hits the floor, and a new line of computers hits the stores every season, time and time again. Right?

Wrong!

Even though these two phenomena are very difficult to compare, there is one difference worth pointing out. Both forces can be said to accelerate something, either matter or 'human technological development'. But the most striking difference is that the acceleration caused by gravity ends very abruptly when the gravitating objects finally impact each other (or, if we're talking the inter-planetary level here, enter an orbit of some kind). Moore's law, on the other hand, if it holds true and microchips continue to develop and increase in efficiency, means the acceleration of human development will NOT stop until the Moore constant is declining. There are many, many articles which discuss the future of the Moore constant, its rate of increase or decrease, or whether it will (not unlikely) be completely obsoleted by a new fundamental computational paradigm. Who knows..

But for the mental exercise, just consider the possibility that Moore's law might predict the future for as long as the next 100 years (not very likely, but still)..
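Just to put a number on that thought experiment, here's my own toy arithmetic, assuming a doubling of chip capability every 18 months and nothing else:

```python
# Toy extrapolation of Moore's law: a doubling every 18 months, carried
# forward blindly. A pure thought experiment, not a prediction.
DOUBLING_PERIOD_YEARS = 1.5

def growth_factor(years: float) -> float:
    """How many times 'better' chips would be after the given number of years."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

for years in (10, 25, 50, 100):
    print(f"after {years} years: x{growth_factor(years):.3g}")
# after 10 years: x102
# after 25 years: x1.04e+05
# after 50 years: x1.08e+10
# after 100 years: x1.17e+20
```

A factor of ten to the twentieth is a number so absurd it barely means anything, which is sort of the point.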

The effect would undoubtedly be that everything you ever imagined that involved microchips will become reality, and then very quickly become yesterday's news. Everything chip-controlled will become increasingly small, until the very barriers imposed by fundamental physics protest, and then things will become more and more complex, still pushing the rules of the physical playground. Just imagine something as small as you can imagine, and you will probably be in the right neighborhood, but still a tad too big.

When thinking along these lines of miniaturization, a new and exciting development will also take place, moved by the same economy/technology push and pull that governs computer advances: the next obvious step will be to integrate the advances in computational and ultra-small technology with the most complex mechanism yet known to humanity, man himself...

Try this. Imagine the Darwinian evolutionary theory of the journey from monkey to man, mixed with the consequences of Moore's law..... done thinking?




I wanna be a cyborg.

Monday, September 20, 2004

O' web interface, thy blessings be ever so many, BUT..

[metablog warning!]

While web interfaces are accessible, platform independent, mostly reliant on open standards, and generally quite handy to use, one BIG problem is their uncanny ability to eat a big portion of work (think 'homework' and 'The Dog'), and still be hungry for more.. One stray click, and GONE.

"Nonononoo.. I didn't want that! Escape! Regret! Undo! *click-click-click*"

Some more navigating, reloading and restless flickering back and forth through the website, before the bitter realization that the text is beyond rescue, AND that this was probably caused solely by my own very poorly guided action. With the assistance of an interface without the necessary foolproofing, and a solution lacking a basic undo/rollback feature, I have given myself the mental equivalent of a good, healthy punch in the face. Stupid, stupid!

Security features like session timeouts, used to avoid unauthorized access on shared computers, have many, many hours of my creative work on their conscience.

This is a source of intense negative vibes in my case, because of the sheer pointlessness of what has happened before my eyes. Written material, of reasonable quality, just disappears by accident of navigation. Here I, the writer, spend some of the most valuable resource known to man, fractions of my own lifetime, refining, considering and putting to print my own words.

And then, for no good reason, the words are lost. Without even a hint of purpose to their brief existence, there has been no audience, no transfer and mating of memes, no good done, with the possible exception of joy on the author's part, however fleeting, derived from work well done.

The text can very well be re-created, maybe in an even better way than the original, but the writer's motivation, mood, and all stress-related physiological processes take a serious hit.

Data loss in general, and loss caused by web interfaces in particular, sucks. The only vaguely positive data loss experience I've ever had was the rapid, successive, catastrophic loss of content on every hard drive in my possession. Weirdly enough, it felt therapeutic.

The conclusion of this rant must be: beware of data loss, keep backups, and save always!

Saturday, September 04, 2004

First posting, all set and.. POSTS AWAY!

Why the silly title? You might wonder what kind of message I'm trying to broadcast, but hey, it's probably more than one.

First of all I really like the sound of it, My Castle. It's all mine. And I can do whatever I want around here. Probably without anybody taking notice, unless I'm loud enough, and invite people over every so often.

I'm online pretty often. The majority of my friends are online too, and not in the "what's-new-in-the-online-version-of-the-local-paper-today,-maybe-some-porn?"-wussy kind of way. We go online to socialize, to seek knowledge, software (legal and, to a varying degree, not-so-legal), news sources of our own preference, even sexual satisfaction (though few make this a public activity, SOMEONE has to fund the multi-billion industry of online porn, right?). The internet is a MAJOR medium, with a considerable impact on our lives.

I, for one, dabble with the idea of earning my disposable income from work and operations connected, in some way or another, to the great big internet. That's quite a thorough adoption of, and a heavy reliance on, a technical solution like the internet.

This may seem stupid, but what I find most remarkable about this is that EVERYBODY does it. There's nothing special about it for a lot of the people living in the good old civilized part of the world (no offence to countries with non-functioning TLDs like .er). The growing percentage of people connecting to a global, digital network on a regular, or constant, basis is so huge that people did not even bother to imagine the effects of it only 20 years ago.

And as a firm believer in Moore's Law, I will have to say 'you ain't seen nothing yet!' about this developing trend.

Forget everything about the '68ers, the dessert generation, generation X and Y; it's the online generation that counts.