Tuesday, October 30, 2007

web browsers

I wish that browsers could get a little more efficient. When you look at a normal web site, it probably has 20 to 40 links on it, and your browser seems completely surprised when you click on one of them. Why can't the computer anticipate a bit? My idea is that all the linked pages would start downloading their HTML while you are still looking at the first page. That way, when you click slowly through a bunch of linked web sites, things go more quickly. The browser would give priority to the initial page, and once that is completely loaded, it would start loading the "secondary" linked HTML. When I say HTML, I mean that just the text of the web pages could be pre-loaded.
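The pre-loading scheme above could be sketched roughly like this. This is a toy sketch in Python rather than real browser code; the `prefetch_queue` function and the 40-link cap are my own names and assumptions, not anything an actual browser exposes:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects href targets from <a> tags, in document order."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def prefetch_queue(page_html, limit=40):
    """Return the ordered list of link targets to fetch in the background
    once the main page has finished loading. `limit` caps the queue so a
    page with hundreds of links doesn't swamp the connection (a made-up
    cutoff for illustration, not a real browser setting)."""
    collector = LinkCollector()
    collector.feed(page_html)
    seen, queue = set(), []
    for href in collector.links:
        if href not in seen:
            seen.add(href)
            queue.append(href)
        if len(queue) >= limit:
            break
    return queue
```

The real browser would then fetch these URLs at low priority, after the visible page finished loading.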

I'm aware that sometimes this wouldn't be much of an advantage, like on web sites with 1,000 links. Most pages aren't like that, though. I figure I could surf 10 times faster with such a browser. Another problem would be security: some pages are likely to load malware onto your computer, and you don't really want your browser fetching those in the background. For that, I figure a filter like the one Google uses could help the browser avoid web site domains that have bad reputations.
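A reputation filter like that might look something like this. A minimal sketch: `BAD_DOMAINS` is a made-up blocklist standing in for a real reputation service (Google's actual list works differently and is much larger):

```python
from urllib.parse import urlparse

# Hypothetical blocklist of domains with bad reputations.
BAD_DOMAINS = {"malware.example", "phish.example"}

def safe_to_prefetch(url, blocklist=BAD_DOMAINS):
    """Return False if the link's host (or any parent domain of it)
    is on the blocklist, so the browser skips pre-loading it."""
    host = urlparse(url).hostname or ""
    if host in blocklist:
        return False
    # Also catch subdomains like a.malware.example.
    return not any(host.endswith("." + d) for d in blocklist)
```

The browser would run each candidate link through this check before adding it to the background queue.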

What else am I not thinking of?


Blogger christovich79 said...

On the Facebook version of this blog, someone commented that there is a plug-in for Firefox that performs this function: Fasterfox.

If you Google it, I'm sure you can find it and use it. I'm using it now, and I'll let you know what I think of it.

11:35 AM  
Anonymous Marc said...

What browser are you using?

2:14 PM  
Blogger christovich79 said...

Firefox 2.0

12:31 PM  

