Grandpa passed today, after losing his fight with pancreatic cancer.
See you when I’m dead. Till then, miss you Grandpa.
It’s probably about time I got around to tapping out another book review post. So, here goes.
I’ve just finished this one. It’s an interesting book, although slightly confused. It starts off with a serial-murder plot, introducing the main antagonists slowly. The murders, however, are not the book’s real aim; you know ‘who-dun-it’ by 75% of the way through. It’s more a social/character discourse, with elements of romance mixed in.
The book does start slow, because of the focus on initial character building. However, it doesn’t grab me as similar books have; I assume because I’ve already become too used to that world, so I’m not being introduced to much I’ve no experience of.
The book did annoy me, though, with one abuse of technology-speak. Yes, techy-speak may be confusing for those not used to it, but email is not passed through more than two UAs, and normally no more than two MTAs. A UA is a User Agent, i.e. your email client: Outlook, Thunderbird, or your web-mail. An MTA, or Mail Transfer Agent, is one of the computer programs email is passed through as it transits the internet. The specific abuse stems from trying to fill out a one-dimensional computer-tech character.
The book finishes on a slightly strange note: murders solved, good guys slightly victorious (with the requisite battering), but the relationships we’ve been following throughout not yet tied off or stable. I assume this is leading to a possible follow-up book, but it just annoys me. It’s one reason I like Elizabeth Moon’s work: she ties up the immediate stuff and leaves characters stable, whilst allowing the greater narrative to hang for the next book.
It’s a good enough read, though not one I’d particularly recommend or lambast.
This is the second in a trilogy from one of the masters of sci-fi/fantasy. It’s basically a sci-fi romance, but the story is enthralling. One in my permanent collection; this would have been a third re-read. It follows Killashandra as she leaves Ballybran to save a lover’s life, gets caught up in planetary political wrangles, and finds a love she’ll never forget. If you’ve not read this series, and are into sci-fi, I’d recommend picking it up. The first in the series is ‘Crystal Singer’; ‘Killashandra’ is followed by the finale, ‘Crystal Line’.
Meh. I finished this book in a day. It’s not a hard read. I saw one of the major twists coming too early, and it’s a pointless, boring twist. I can think of at least three better ways to finish this plot.
Basically, read it if you want another Bond story, but don’t expect it to be a particularly challenging or rewarding read. Of course the good guys win, but I do like it when they win in a bit more style!
That’s the books I’ve read this week. Now I’m starting to run out of fresh ones, so I’m going to be mainly re-reading stuff, I guess. Any suggestions on what to look out for are gratefully received. Sci-fi/fantasy is my preference, but I’ll read almost anything apart from horror. (And Mills & Boon types.)
Web 3.0 is coming soon…
Linking
IMHO, the Web 3.0 revolution will consist of the websites and web apps of the 2.0 era becoming more closely linked.
I think it will become easier to link together content across websites to create new forms of content.
The Web 2.0 revolution was helped by blogs, with authors linking together information in posts. (This, I might add, has been very useful in combating the slew of dodgy sites that sit high in Google’s results but just spit back the search terms as results, nullifying your search. Nowadays I find I use ‘blog’ in search terms, especially when looking for reviews.)
I can’t wait until someone puts together a really good way of visualizing all this data. As the internet grows, the importance of being able to sift through the available data and collate it into collections on particular topics becomes paramount.
I have been looking out for a system to visualize my internet links in some kind of subject-oriented way, with a timeline / time axis. So far the only thing that comes close is Basket Notes for KDE (screenshots). If only that were a web app! (If I had the motivation and focus, I’d turn my meagre PHP programming skills to that task myself; but alas, like my sketched design for a social networking site, written in my design book before the advent of Facebook, I think I’ll leave it to someone else!)
I guess the closest web-based system of this kind (that I’m aware of) currently in operation is Wikipedia!
Retrieval
Look at the useful Firefox plugin Ubiquity, and the fantastically useful cross-platform application and search launcher Launchy, for example. Both of these are designed to give us quicker access to, and search abilities for, our data.
Workflow
Making computers integrate seamlessly into our lives, rather than interrupting them.
Today the focus of computing is shifting from _ to the workflow – how we get things done. I think this is essential, because your average end user doesn’t care how things get done, just as long as they can get done.
Digital photographers often use a prescribed workflow when working on digital photos – ‘developing’ them, as it were, to bring out the best. PC Pro magazine suggests: 1. levels and curves, then 2. colour adjustment, followed by 3. sharpening. But I’m talking about more than just the best sequence of events to achieve the best quality output. I’m talking about the process itself.
Our brains think sequentially: each action is broken down step by step, and the steps performed one after another. A break in our concentration, or ‘flow’, impacts our effectiveness. This is especially true for people with ADHD (like me), which is why reducing the need for context switching matters so much.
“Consider that it takes 15 minutes for a developer to enter a state of flow. If you were to interrupt a developer to ask a question and it takes five minutes for them to answer, it will take a further 15 minutes for them to regain that state of flow, resulting in a 20 minute loss of productivity. Clearly, if a developer is prevented from flowing several times during the day their work rate declines substantially. “
(Retrieved from http://softwarenation.blogspot.com/2009/01/importance-of.html)
For example: downloading pictures from your digital camera and uploading them to Facebook. Recently I’ve been using ‘Windows Live Photo Gallery’. Ugh, I know, but the point is that Vista offered it to me, and there was an easy-to-find-and-add plugin that allows me to upload direct to Facebook, where most of my photos end up these days.
To download the pictures I simply flip the SD card out of my camera and insert it into my laptop’s SD card slot (useful laptop buying advice).
And that’s the point, people will take the path of least resistance/effort.
Path of Least Effort Principle
According to my observations – like people walking down the high street striving to avoid collisions with other pedestrians – everybody is operating on the principle of least effort: the person you are approaching will attempt to take a path needing the least diversion from their original path in order to avoid collision, while you yourself attempt to do the same thing.
How does this come back to Web 3.0?
How many clicks does it take, while searching for some long-forgotten but relevant piece of information, before a user will get bored and move on? [research advertising, google hotspots, number of clicks] Could it be as low as 3, and as high as 8?
Unified User Interface
Take Facebook, for example. I was trying to find my note on laptops to include a link in this article, but alas, clicking Notes from the home page only brought up a ‘feed’ of Notes. Where, I ask, are the filter options that reside on everyone’s profiles? Why can’t I select ‘Just Garreth’ here too?
If something like that is useful, it should also be Unified – that is, available everywhere!
In the time it took me to discover the ‘workflow’ to access my notes, in this ‘fast/bitesize/information-obsessed’ age, my poor overloaded ADHD brain (video: ADHD impact on life) might easily have become bored, frustrated and, more importantly, distracted, and moved on…
Availability
Cloud computing and Rich Web Applications (Blog: Google and Rich Web Application)
Organisation of Data
TOC
Concise
It’s an inverse law – as our attention spans decrease, so the conciseness of the data we consume must increase, ceteris paribus.
Why do my spidey senses tell me Facebook, not Google, may be the winner in the Web 3.0 revolution?
What do you think? Leave a comment with your vision, and what you think of my ideas.
Just a silly short post about a beep song I was making while waiting for a partition resize to go through.
This should run on pretty much any Linux system; just copy and paste 😉
beep -f 1000 -n -f 1500 -n -f 600 -n -f 500 -n -f 100 -r 2 -l 10 -n -f 50 -r 2 -l 200 -n -f 40 -r 2 -l 300 -n -f 60 -r 3 -n -f 50 -r 3
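(For reference: -f sets the tone frequency in Hz, -l the note length in milliseconds, -r the number of repeats, and -n starts a new note.)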
Thanks, GParted and System Rescue CD (Linux)!
Please continue my little ditty in the comments!
PS: modern computers may need speakers plugged in and turned on to make the magic happen, but shouldn’t need sound drivers.
Enjoy!
Edit: check out the followup post here: https://kirrus.co.uk/2014/02/linux-beep-music-2/
So, the Windows version of ping is really stupid.
I was writing a batch script to mount a network share, which involved checking that my NAS unit was turned on. The script is scheduled to run after the computer resumes.
What I found out is that the built-in version of ping.exe is terrible at telling you whether the ping has returned successfully or not. I was checking the ERRORLEVEL – the %ERRORLEVEL% variable – to find out what ping was returning. It should be 0 for success and 1 or higher for a failure.
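For illustration, here’s a minimal sketch of the sort of check I was attempting – the NAS name, drive letter and share name here are placeholders, not from my actual script:

@echo off
rem Hypothetical NAS hostname - substitute your own
set NAS=mynas

rem Send a single echo request and discard the output
ping -n 1 %NAS% >nul

rem ERRORLEVEL should be 0 if the host replied, 1 or higher if not
if %ERRORLEVEL% EQU 0 (
    net use Z: \\%NAS%\share
) else (
    echo %NAS% appears to be offline
)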
What I found was that I was getting replies from the local PC (dunno why – leave me a comment if you know), and ping was reporting a success even though the correct PC failed to reply. The solution?
Replace the Windows ping.exe with Fping. It has a lot more options and appears – from some initial quick tests – to correctly report the errorlevel.
Kudos to Wouter Dhondt for developing it. I’ll update this post with any more news!
Back in June of this year, PC Gamer launched a new website. This website design appears to be a rip-off of that used by Rock Paper Shotgun. With all the images that follow, click through for a larger version.
But let’s roll back, shall we? Rock Paper Shotgun launched in September 2007, though their first post goes back to July 2007. They were a novel PC gaming blog, trying to do something different in the gaming scene. They concentrated on PC games and only PC games, with running jokes. They have a small enough set of writers that you can pick up the personality of each. (Kieron takes the weird ones. VERY NSFW: example.)
Back in 2007, pcgamer.co.uk redirected to a sub-site of www.computerandvideogames.com. Since then, they haven’t altered the design at all. Now, it redirects to pcgamer.com. Looking at the two reveals this:
As an ex-web-developer, it looks to me like someone decided that they quite liked the RPS type of website and went ‘make me a website like that, but in this style’, then tweaked the mock-ups (and site designs) a few times, till what they had looked remarkably like what we see now.
Saying that, of course, this is quite a standard design style. It often comes easily when you use WordPress as your back-end engine, as this blog does, and as RPS does. However, they’ve not just used the WordPress site layout as a base; they’ve decided to publish all of their posts in the same sort of format as RPS, with the same aim of getting discussions going around their posts via the commenting.
A little birdie 1 tells me that someone at Future (the company behind PC Gamer) really might hate Rock Paper Shotgun – would rather they disappeared. It’s almost like they’ve finally decided to fight this sphere of influence with money and lots of people; finally decided that maybe their website is worth working on and taking care of.
What annoys me, is that the big guy is trying to kill the little guy 🙁
Here are a whole load of screenshots, to save you finding them. Some are from the Wayback Machine, some are from the website directly.
The old website, as it was up till June. This image was recovered with a lot of hard work by webpigeon of unitycoders.co.uk (thanks!), since PC Gamer used some really horrible website coding, which broke the Wayback Machine copy. This has to be one of the ugliest websites I’ve seen, though not the worst. You could switch the big image, and below it was a list of recent stories.
And, if you scroll down a bit..:
They seem to be trying to throw links at you, lots and lots and lots of them, in a really small space. Check it out for yourself.
Rock Paper Shotgun’s footer:
PC Gamer’s footer:
OOo… don’t they look similar? Apart from the ‘we must keep up with the cool kids’ Twitter panels and lots and lots of post links (which RPS doesn’t force on you, or puts in the right-hand panel). This mess could also be due to Search Engine Optimization, that dark art in which you try to trick search engines into putting you higher up their listings than your arch-rivals.
Now, I work for the company that keeps RPS online. I like the guys that work there; I think they do a good job, especially considering they’re not getting paid much for it.
Also interesting is the fact that PC Gamer seem to have thrown money at this venture. I work with some high-load WordPress-powered sites, and there are some very obvious things you do to make them work fast. Very fast. PC Gamer isn’t doing at least one of the most obvious, which suggests that instead they’ve thrown cash at keeping it online, with a cluster of computers working on it. Don’t know how a website works? Find out here 2
A forum friend provided this recipe when I asked for suggestions for a spare chicken breast (the other one went into a curry). Very yummy indeed, thanks!
Season chicken breast with salt, pepper and your favourite spice blend.
Bake or grill it. (About 30-45 minutes at 175-200 degrees C)
Once cooked, let it rest and start to cool whilst you prepare the bits to go with it.
Smear mayonnaise and mustard on two slices of bread. Slice the chicken thinly into strips. Then add lettuce leaves to the bread, the sliced up warm chicken, and top with tomato slices.
I mixed up the order, adding the chicken and then the lettuce, but it still tasted nice:
Sorry about the rubbish photo. Taken on my phone.
Currently, according to mainstream media, bandwidth is defined as the quantity of data you download from or upload to the internet over a month. So, for example, your ISP will tell you your maximum bandwidth limit is 100GB. Or whatever.
That, however, is not its true definition. Its true definition is:
a data transmission rate; the maximum amount of information (bits/second) that can be transmitted along a channel 1
This is the secret thing about bandwidth. ISPs don’t care about how much you upload to the web over a given period. We care about how fast you upload it.
When you pay for a high-level connection to the internet – the kind you use to connect houses, or web-serving computers – you do not pay in quantity over time. You pay in speed. So, for example, 1 gigabit per second. If you go over that speed for longer than an allowed ‘burst’ period, you pay an overage charge, always assuming that your network is even capable of going over that speed.
Think of bandwidth like gas going through a pipe. (Terrible, terrible analogy, I know. But it’s the easiest way to explain.) That gas can only flow so fast, and only so much can fit in the pipe at any one time. We don’t particularly care if you use 100GB by taking a trickle out of the system at any one time. We do care if you take a torrent.
Realistically though, customers never notice bandwidth. They’re too busy playing with computer-resource-hungry things, like WordPress, to even be able to consume all of their allocated bandwidth. Only very, very rarely do we actually start thinking about bandwidth rather than computing resources. Normally, it’s podcasts. A static file. Almost no server resources are required to send it out onto the internet. But it eats bandwidth. Most are ~50–80 megabytes per episode. Get enough people downloading that simultaneously, and we’re going to start noticing…
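To put some rough, purely illustrative numbers on that: an 80 megabyte episode is 640 megabits, so a hundred listeners each pulling one down at 10 megabits per second would, between them, completely saturate a 1 gigabit per second uplink.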
As long as the current trend continues – i.e. the more computing power we have available to provide you with your shiny websites, the more the people creating those shiny websites waste – the mainstream will never notice this secret.
More often than not, the reason we ask people to upgrade off our shared servers is not because they’ve reached any arbitrary bandwidth limit (although we may use that as a guide to identify them); it’s because they’re using too much CPU time.
I know that someone stole from the charity shop today. Found the remnants of a plastic tag, broken by teeth, on the floor in the changing room. Thought there was someone doing something suspicious in there earlier, but got distracted by people paying.
Not the first time either; we had a set of known thieves in three weeks ago, and I think they probably succeeded – someone found a destroyed tag outside the shop.
I wonder, do they steal from need, from dependence on stealing, or for the excitement, the thrill of the crime?
Having never stolen anything physical, to my knowledge, I don’t know.
Guess this will be one area of curiosity never sated. I just have to keep an eye out for them.
Photo below: Gerald the Giraffe enjoying a glass of Coke, having just been rescued from the kidnapping admin team at one of our offices 😉