Thursday, September 11, 2008

Google's Decade

Ten reasons why Google is still number one.
Friday, September 12, 2008
By David Vise



David A. Vise is a Pulitzer Prize winner and coauthor of The Google Story: Inside the Hottest Business, Media and Technology Success of Our Time, published by Delacorte Press (Updated edition 2008).

1998: Yahoo and others pass on the chance to buy new search technology developed at Stanford University for $500,000. Their rationale: "Search doesn't matter. Portals do." These rejections forced Sergey Brin and Larry Page to reluctantly take a leave of absence from Stanford (both wanted to become college professors, like their dads) to see if they could turn Google, their new search engine, into a business.

1999: A year later, Google garners $25 million from Sequoia Capital and Kleiner Perkins, even after Brin and Page insist that each venture-capital firm would only be allowed to invest half the money. This "divide and conquer" strategy leaves the denizens of Sand Hill Road to squabble with one another while Brin and Page retain control over their nascent enterprise. It also enables them to frustrate the moneymen by refusing to deface the most valuable piece of real estate on the Internet--the Google home page--by loading it up with ads, and by turning away numerous CEO candidates.

2000: Yahoo chooses Google to power searches on its portal, exposing the search engine to millions of users for the first time. With growth accelerating, Google benefits from the bursting of the technology bubble since this allows it to hire talented engineers for lower wages.

2001: Israeli entrepreneur Yossi Vardi persuades Brin and Page that Google can profit by drawing a thin blue line down the page, putting organic search results on the left, and placing small text-based ads on the right. Brin and Page previously resisted ads, fearing that they would cause people to lose confidence in the integrity of Google's search results. While at Stanford, they coauthored and delivered a paper called "The Evils of Advertising."

2002: Google cuts a deal with AOL to power search and syndicate ads, giving it access to 34 million America Online customers, and inks other similar pacts. Revenue increases 500 percent and profits skyrocket to nearly $100 million.

2003: Google launches an automated program to deliver contextually relevant text-based ads to hundreds of thousands of affiliated websites, creating a self-reinforcing network of support for the company and giving Internet publishers an easy way to profit by letting Google display ads on their pages. This strategy makes Google reminiscent of the three major television networks in their glory days: give away the programming for free, and profit from ads displayed on the network of affiliates.

2004/2005: Google raises billions of dollars in an unconventional initial public offering and a successful secondary offering, with Brin and Page retaining absolute control by owning shares with super-voting rights. In the process, they drive Wall Street professionals crazy by forcing investment bankers working on the deal to sign confidentiality agreements at every single meeting, rather than only once. With "Don't be evil" as the company motto, Brin is anointed as arbiter of good taste, deciding what does, and does not, fit. Beer and cigarette ads are out; wine ads are in.

2006: Google buys YouTube, giving it control of the world's most popular online video site. Google bets big on the future of video, paying $1.65 billion, even though YouTube has no sales. The view is that Google will capture the eyeballs and mindshare now and figure out how to make money later.

2007: Brin and Page both marry brainiacs at destination weddings, ensuring the pedigrees of their progeny. Brin and Page also reaffirm their private pact to work together for at least another decade atop Google. And with the stock price soaring, the duo breaks into the elite ranks of the top five on the Forbes 400 List of the Wealthiest Americans, with a net worth exceeding $15 billion each.

2008: Google's 10th birthday coincides with the 10th anniversary of the Justice Department's antitrust suit against Microsoft, which forced the giant to compete with one hand tied behind its back while Google raced ahead. Now Google is replacing Microsoft in the crosshairs of the Justice Department's antitrust division, due to the dominance of its online ads and search, and a proposed ad pact with Yahoo, its biggest competitor.


Source : http://www.technologyreview.com

A Guide to RSS Aggregators

by: Terry Leslie


One of the most popular features of Internet portals, websites, pages, and even emails is a frame that displays an organized list of news headlines and periodic updates from other web sources. Really Simple Syndication (formerly "Rich Site Summary"), or simply RSS, makes this possible.

Most users visit many websites whose content changes continually, such as news sites, community organization or professional association pages, medical websites, product support pages, and blogs. As Internet surfing became an intrinsic part of business and leisure, it became important to eliminate the tedious task of repeatedly returning to each website to check for updated content.

RSS distributes information from different websites to a wide audience of Internet users. RSS aggregators are programs that use RSS to collect these updates and then organize the resulting lists of headlines, content, and notices for easy reading. They let computers automatically retrieve the content users want, track changes, and build personalized lists of the headlines that interest them.

These specially made programs, "RSS aggregators", automatically find and retrieve the RSS feeds of pre-selected internet sites on behalf of the user and organize the results accordingly. (RSS feeds and aggregators are also sometimes referred to as "RSS channels" and "RSS readers".)

An RSS aggregator is like a web browser for RSS content. HTML presents information directly to users, while RSS lets computers communicate with one another automatically. Where users surf the web with a browser, loading and viewing each page of interest, an RSS aggregator keeps track of changes to many websites at once. The titles or descriptions are themselves links that load the web page the user wants.

RSS starts with an original website whose administrator makes content available. The site creates an RSS document describing that content and produces an RSS feed, or channel, which sits on the web server alongside its other resources and documents. The site can then register the feed with directories of RSS publishers so that other websites can syndicate it.

An RSS feed is composed of website content listed from newest to oldest. Each item usually consists of a simple title describing it, a fuller description, and a link to a web page with the actual content. In some instances, the short title or description line is all the updated information a user wants to read (for example, final scores in sports, weblog posts, or stock updates). In those cases a web page does not even need to be associated with each item; the titles and short summaries themselves carry everything the user needs.

The RSS content lives in a single file on a web server, in a manner not very different from a typical web page. The difference is that the information is written in XML for consumption by an RSS aggregator, rather than in HTML for a human reader.
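That structure can be sketched with a short, hypothetical example: a minimal RSS 2.0 document and a few lines of Python (standard library only) that read it the way an aggregator would. The feed contents and URLs here are made up.

```python
# Parse a minimal, hypothetical RSS 2.0 document: a <channel> holding
# <item> entries listed newest-first, each with a title, link, and description.
import xml.etree.ElementTree as ET

FEED_XML = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example News</title>
    <link>http://example.com/</link>
    <description>A hypothetical news feed</description>
    <item>
      <title>Newest story</title>
      <link>http://example.com/2</link>
      <description>Summary of the newest story.</description>
    </item>
    <item>
      <title>Older story</title>
      <link>http://example.com/1</link>
      <description>Summary of an older story.</description>
    </item>
  </channel>
</rss>"""

def headlines(feed_xml):
    """Return (title, link) pairs in the order the feed lists them."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

print(headlines(FEED_XML))
```

An aggregator would render each returned title as a clickable headline linking to the full page.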

Two main parts are involved in RSS syndication: the source end and the client end.

The client end of RSS publishing is the part of the system that gathers and uses the RSS feed. The Mozilla Firefox browser, for example, typically sits at the client end of an RSS transaction, as does a user's desktop RSS aggregator program.

Once the URL of an RSS feed is known, a user can give that address to an RSS aggregator program and have the aggregator monitor the feed for changes. Many aggregators come preconfigured with a ready list of RSS feed URLs for popular news or information websites that a user can simply choose from.
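As a rough sketch of that monitoring step (not any particular aggregator's implementation), the following Python keeps the set of item links it has already seen and reports only unseen items on each poll. A real aggregator would fetch the feed URL on a timer, for example with urllib, before calling this; the tiny feed snapshots below are invented.

```python
# Toy change-detection: report only feed items not seen on earlier polls.
import xml.etree.ElementTree as ET

def new_items(feed_xml, seen_links):
    """Return unseen (title, link) pairs and record them in seen_links."""
    fresh = []
    for item in ET.fromstring(feed_xml).iter("item"):
        link = item.findtext("link")
        if link not in seen_links:
            seen_links.add(link)
            fresh.append((item.findtext("title"), link))
    return fresh

seen = set()
first = "<rss><channel><item><title>A</title><link>/a</link></item></channel></rss>"
later = ("<rss><channel><item><title>B</title><link>/b</link></item>"
         "<item><title>A</title><link>/a</link></item></channel></rss>")
print(new_items(first, seen))  # first poll: item A is new
print(new_items(later, seen))  # second poll: only item B is new
```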

There are many RSS aggregators available to Internet users. Some are accessed through the web, some are incorporated into email applications, and others run as standalone programs on the personal computer.

RSS feeds have evolved into many uses. Some uses gaining popularity are:

• For online stores or retail establishments: notification of new product arrivals
• For organization or association newsletters: title listings and notification of new issues, including email newsletters
• For weather services: updates and other alerts about changing geographic conditions
• For database management: notification of new items added, or of new members registered to a club or interest group

The uses of feeds will continue to grow, because RSS aggregators make access to whatever information individual users care about more convenient and even fun.

In the meantime, good luck on your journey to success…

OR if you would like to succeed immediately to create financial freedom working only 4 hours a week, check out http://www.Secrets2InternetFortunes.com.

AND for a Limited Time, you will also receive a FREE copy of a limited number of the amazing 60 page eBook “52 Highly Profitable Instant Online Business Ideas That You Can Steal As Your Own And Start Today On A Very Tight Budget!”, which is jam packed with so many ideas you can use to instantly create an automated income for life! That’s my GIFT to You as a way of saying thank you for reading my articles.




About The Author

Terry Leslie is a successful and world-renowned authority on both online and offline marketing and rapid business creation, and a much sought-after global speaker on internet marketing, business development, self-improvement, and human peak-potential training.

For more Secrets to Internet Business success,
check out http://www.secrets2internetfortunes.com

A Guide on RSS Tool

by: Terry Leslie


RSS is an abbreviation whose expansion has evolved with its versions:

• RDF Site Summary (also known as RSS 0.9; the first version of RSS)
• Rich Site Summary (also known as RSS 0.91; a prototype)
• Really Simple Syndication (also known as RSS 2.0)

Today, RSS stands for 'Really Simple Syndication', and it exists in the following seven formats or versions:

• 0.90
• 0.91
• 0.92
• 0.93
• 0.94
• 1.0
• 2.0

RSS tools refer to a group of file formats designed to share headlines and other web content (a summary, or simply one or two lines of an article), links to the full versions of that content (the full article or post), and even file attachments such as multimedia files. All of this data is delivered as an XML file (XML stands for eXtensible Markup Language), which goes by several common names:

• RSS feed
• Webfeed
• RSS stream
• RSS channel


Feeds are typically indicated on web pages by an orange rectangle that usually has the letters XML or RSS in it.

RSS feeds can be used to deliver any kind of information. Some of these 'feeds' include:

• Blog feed - each blog entry is summarized as a feed item, making posts easier to scan and letting visitors zoom in on the items that interest them.

• Article feed - alerts readers whenever new articles and web content become available.

• Forum feed - lets users receive the latest forum posts and discussion topics.

• Schedule feed - lets organizations such as schools and clubs broadcast events and announce schedule changes or meeting agendas.

• Discounts or specials feed - lets retail and online stores deliver their latest specials and discounted offers.

• Ego or news monitoring feed - delivers 'filtered' headlines or news based on a specific phrase or keyword.

• Industry-specific feed - used by technical professionals to market, promote, or communicate with current and prospective customers and clients within their industries.

RSS feeds enable people to track numerous blogs and news sources at the same time. To produce an RSS feed, all you need is the content you want to publicize and a validated RSS text file. Once that file is registered with various aggregators (or 'news readers'), any external site can capture and display your RSS feed, which updates automatically whenever you update your RSS file.
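A minimal sketch of the production side, using nothing beyond the Python standard library: build an RSS 2.0 document programmatically and serialize it to the text that would be saved as your feed file. The channel name, URLs, and item details are placeholders.

```python
# Build a minimal RSS 2.0 feed with the standard library.
import xml.etree.ElementTree as ET

def build_feed(title, site_link, items):
    """items: list of (title, link, description) tuples, newest first."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = title
    ET.SubElement(channel, "link").text = site_link
    ET.SubElement(channel, "description").text = title
    for item_title, link, desc in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = item_title
        ET.SubElement(item, "link").text = link
        ET.SubElement(item, "description").text = desc
    return ET.tostring(rss, encoding="unicode")

feed = build_feed("My Articles", "http://example.com/",
                  [("New article", "http://example.com/new", "A summary.")])
print(feed)  # save this text as e.g. feed.xml on your web server
```

Running the resulting file through an RSS validator before publishing it is the "validated RSS text file" step mentioned above.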

RSS tools are useful for sites that add or change their content regularly. They are especially suited to 'web syndication', i.e. activities that involve regular updates and/or publications, such as the following:

• News websites - as used by major news organizations such as Reuters, CNN, and the BBC.
• Marketing
• Bug reports
• Personal weblogs

There are many benefits to using RSS feeds. Aside from being a great supplemental communication method that streamlines the communication needs of various sectors, RSS tools and feeds can also have tremendous benefits in your business, particularly in the field of internet marketing.

RSS tools and feeds provide Internet users with a free (or cheap) and easy advertising or online marketing opportunity for their businesses. Below are some of the RSS features that can help make your internet marketing strategies more effective.

1. Ease of content distribution. With RSS, your content can be captured and displayed by virtually any external site, giving you an easy way to spread the word and advertise your business.

2. Ease of regular content updates. With RSS, web content concerning your business can be updated automatically on a daily (or even hourly) basis. Internet users experience 'real time' updates: as information in your own file changes (new products and other business-related releases, say), the RSS feeds people subscribe to change with it.

3. Custom-made content services. With RSS, visitors can receive personalized content, giving them total control over the flow and type of information they receive. Depending on their interests and needs, visitors can subscribe to only the content they are looking for (such as real estate or job listings).

4. Increased, targeted traffic. With RSS, traffic is directed to your site as readers of your content summary (or the one or two lines shown of your article) who find it interesting click through to the full version on your site.

These are just several of the many things that you can do with RSS. The possibilities are endless, and they are all aimed at providing you with an effective internet marketing strategy for your business.




Wednesday, September 10, 2008

A Network That Builds Itself

Ad-hoc wireless networks may soon tell emergency workers how to deploy transmitters.


By Michael Fitzgerald



Building an on-the-fly wireless communications network is a vital part of firefighting, handling hostage situations, and dealing with other emergencies. But it is difficult to build such networks quickly and reliably.


Soon these emergency wireless networks could help build themselves. The National Institute of Standards and Technology (NIST) recently presented details of two experimental networks that tell emergency workers when to set down wireless transmitters to ensure a good signal.


Ad hoc wireless networks relay messages between transmitters, or nodes, without requiring any central control. But as things stand, emergency workers simply follow suggested guidelines for building such a wireless network--placing each node 15 or 30 meters apart and at key points, like the corners of a building. Or they periodically check back with the command center to make sure they're still in touch. Neither method is terribly efficient in an emergency, however. The process can also be costly if a large number of nodes are used.

The NIST prototypes, which have been under development for more than three years, use algorithms to monitor the signal-to-noise ratio of transmissions and automatically warn when a new node should be set down.


"We didn't want to have fixed rules, because there can be a lot of metal in walls or cinder block," meaning signal strength varies building to building, says Nader Moayeri, a senior technical advisor in NIST's Advanced Network Technologies Division. "Plus, you don't want to deploy too many, because of the cost factor as well as potential for communication delays."


Moayeri says that NIST considered having nodes ping each other with short messages to see how many packets of data were lost in transit. The problem with this approach is that the person deploying the network would not detect a weak connection immediately and might have to backtrack. Having an algorithm measure the signal-to-noise ratio instead avoids this problem and provides a clearer picture of connection strength.
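The article does not spell out NIST's actual algorithm, but the idea can be illustrated with a deliberately simplified sketch: average the last few signal-to-noise readings and warn when the average falls below a threshold. The threshold, window size, and SNR values here are invented for illustration only.

```python
# Hypothetical sketch of SNR-based node placement, NOT the NIST algorithm:
# smooth recent signal-to-noise readings (in dB) and warn when the smoothed
# value drops below an assumed threshold, meaning a new node should go down.
from collections import deque

SNR_THRESHOLD_DB = 10.0  # assumed threshold, invented for illustration
WINDOW = 3               # number of recent readings to average

class NodePlacementMonitor:
    """Warn when the smoothed link quality to the last node gets weak."""

    def __init__(self, threshold_db=SNR_THRESHOLD_DB, window=WINDOW):
        self.threshold_db = threshold_db
        self.readings = deque(maxlen=window)

    def update(self, snr_db):
        """Record one SNR reading; return True when a node should be placed."""
        self.readings.append(snr_db)
        return sum(self.readings) / len(self.readings) < self.threshold_db

monitor = NodePlacementMonitor()
for snr in [20, 15, 10, 7, 5]:  # readings while walking away from a node
    if monitor.update(snr):
        print("Set down a new node here")
        break
```

Averaging over a short window, rather than reacting to a single reading, avoids false alarms from momentary fading, which echoes the article's point that raw packet-loss pings give a noisier picture than SNR.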


NIST built two prototype networks using off-the-shelf hardware. One operates at 900 megahertz and uses Crossbow MICA2 Motes to transmit radio signals. The other, a Wi-Fi network operating at 2.4 gigahertz, uses Linux-based Gumstix transmitters. But Moayeri says that the NIST algorithm should work with any wireless hardware and on any available spectrum.


In the Crossbow system, each node has an LED that automatically changes color, from green to red, when a new node needs to be set down. The Gumstix system issues alerts via a handheld or tablet computer connected to the same wireless network.


Each hardware platform has different strengths and weaknesses. The Crossbow system can be customized easily but has a maximum data transfer speed of 35 kilobits per second, limiting the network to text messaging. The Gumstix system is less flexible but can transfer data at 54 megabits per second, allowing users to talk and send other data over the network. Both types of node measure approximately five by ten centimeters and cost between $200 and $300.


Moayeri's team tested the Crossbow network in an 11-story building on the NIST campus in Gaithersburg, MD, deploying 11 nodes in the stairwell. The Gumstix network was tested throughout another NIST building that goes 40 feet belowground and features winding corridors as well as a number of metal doors. A total of eight nodes were used to cover about 300 meters.

Moayeri says that the maximum transmission power for the Gumstix node was about 100 milliwatts, while the Crossbow MICA2 Mote transmitted at approximately three milliwatts. Since a typical police or firefighter radio transmits at one to five watts, far fewer nodes would be needed in a real-world scenario. However, it's not clear how much it will cost to make rugged and fireproof nodes.


A potential downside of the NIST prototype is that it cannot track location unless it is in a building that already has passive RFID chips installed.

Moayeri and his colleague Michael Souryal presented details of the two prototype networks at the third annual Precision Indoor Personnel Location and Tracking for Emergency Responders technology workshop, held at Worcester Polytechnic Institute in early August.


Their presentation caught the interest of one workshop attendee--Alan Kaplan, chief technology officer at Drakontas, a company based in Glenside, PA, that makes communications software for public safety and security operations. His firm's software currently requires users to check connections between nodes as they are deployed. "What I thought was cool is that the technology seemed to help users as they built out this network, telling where they should actually place these nodes," says Kaplan. "Potentially, this is something that anyone who does public safety or security would want."

Source : www.computerreview.com

The Battle of the Browsers – The History and the Future of Internet Browsers

by: Nicholas C Smith

With Internet Explorer 8 now available, can Microsoft hope to retain market dominance over fierce open source rivals such as Mozilla's Firefox or the feature-packed Opera web browser? Can history give us a clue to what the future of web browsers and browsing might hold? How did Netscape Navigator go from a dominant 89.36% share of the web browser market in 1996 to barely 10% by 1999?

Let us take a journey that will begin long before even the intellectual conception of Internet Explorer, that will glance at its long defeated rivals, examine the current browsers available and will end with a prediction of what the future of browsing will offer us – and which browser(s) will still be around to offer it.

People often think that Internet Explorer has been the dominant web browser since the golden age of the internet began. For a very long time it has indeed been the most popular browser, and at times it has been almost totally unrivalled. This was mainly a result of it being packaged free with Microsoft Windows, in what some would later call a brutal monopolisation attempt by Microsoft. The last few years, however, have heralded the arrival of new, possibly superior browsers. Mozilla's Firefox has been particularly successful at chipping away at Explorer's market dominance. So where did it all begin, and why were Microsoft ever allowed such near-total market dominance?

Origins

The truth is they never did have total dominance, but at times they have come very close. Microsoft actually entered the Browser Battle quite late on. In fact, a man named Neil Larson is credited as one of the originators of internet browsers: in 1977 he created a program for the TRS-80 computer that allowed browsing between "sites" via hypertext jumps. This was a DOS-style program and the basis of much to come. Slowly, other browsers powered by DOS and inspired by it were developed. Unfortunately, they were often constricted by the limitations of the still fairly young internet itself.

In 1988, Peter Scott and Earle Fogel created a simple, fast browser called Hytelnet, which by 1990 offered users instant logon and access to the online catalogues of over five thousand libraries around the world – an exhilarating taste of what the internet, and web browsers, would soon be able to offer.

In 1989 the original World Wide Web was born. Using a NeXTcube computer, Tim Berners-Lee created a web browser that would change how people used the internet forever. He called his browser WorldWideWeb, a name still likely to sound familiar to internet users today. It was a windowed browser capable of displaying simple style sheets, editing sites, and downloading and opening any file type supported by the NeXTcube.

In 1993 the first popular graphical browser was released. Its name was Mosaic, and it was created by Marc Andreessen and Eric Bina. Mosaic could be run on Unix and, very importantly, on the highly popular Microsoft Windows operating system (incidentally, it could also be used on Amiga and Apple computers). It was the first Windows browser that could display graphics and pictures on a page alongside textual content, and it is often credited with triggering the internet boom by making the internet bearable for the masses. (It should be noted that the browser Cello preceded it on Windows but made very little impact compared with Mosaic.)

The Browser Wars - Netscape Navigator versus Internet Explorer

Mosaic's decline began almost as soon as Netscape Navigator was released (1994). Netscape Navigator was created by Marc Andreessen, one of the men behind Mosaic and a co-founder of Netscape Communications Corporation. Netscape was unrivalled in features and usability at the time. One major change from previous browsers, for example, was that it let surfers see parts of a website before the whole page had downloaded: people no longer had to wait minutes just to find out whether the page they were loading was the one they were after, and they could read information on the page as the rest of it arrived. By 1996 Netscape had almost 90% market dominance, as shown below.

Market Share Comparisons of Netscape Navigator and Internet Explorer from 1996 to 1998

                    Netscape      IE
October 1998        64.00%        32.20%
April 1998          70.00%        22.70%
October 1997        59.67%        15.13%
April 1997          81.13%        12.13%
October 1996        80.45%        12.18%
April 1996          89.36%         3.76%

In these two years Netscape clearly dominated the internet browser market, but a new browser named Internet Explorer was quickly gaining ground on it.

Clearly worried about Netscape's dominance, Microsoft released its own browser (ironically based on the earlier Mosaic, which had been created by one of the men now running Netscape). The worry was not so much that Netscape would hold a 100% share of browsers on the Windows operating system, but that browsers would soon be capable of running all types of programs. That would mean forgoing the need for a real operating system, or at most requiring only a very basic one, which in turn would let Netscape dictate terms to Microsoft, and Microsoft was not going to let that happen easily. Thus, in August 1995, Internet Explorer was released.

By 1999 Internet Explorer had captured an 89.03% market share, whilst Netscape was down to 10.47%. How could Internet Explorer make up this much ground in just two years? It came down to two things. The first, and by far the most important, was that Microsoft bundled Internet Explorer with every new copy of Windows, and since Windows was used by about 90% of the computing population, that gave it a huge advantage. Internet Explorer also held one other ace over Netscape: it was simply better. Netscape Navigator had grown stagnant, and the few new features it introduced were often perceived by the public as benefiting Netscape's parent company rather than its user base (i.e., features that would help it monopolise the market). Explorer, on the other hand, received much attention from Microsoft. Regular updates and excellent usability, plus a hundred-million-dollar investment, would prove too much for Netscape Navigator.

2000 – 2005

These years were fairly quiet in the Battle of the Browsers. It seemed as if Internet Explorer had won the war and nobody could even hope to compete with it. In 2002/2003, around the time of IE 5/6, it had attained about 95% of the market share. With over 1,000 people working on it and millions of dollars being poured in, few had the resources to compete. Then again, who wanted to? It was clearly a volatile market, and besides, everybody was content with Internet Explorer. Or were they? Some people saw faults with IE: security issues, incompatibility issues, or simply bad programming. Not only that, it was being shoved down people's throats, with almost no competition to keep it in line or to turn to as an alternative. Something had to change. The only people with the ability and the power to compete with Microsoft took matters into their own hands.

Netscape was now supported by AOL. A few years prior, just after they had lost the Browser Wars to Microsoft, they had released the coding for Netscape into the public domain. This meant anybody could develop their own browser using the Netscape skeleton. And people did. Epiphany, Galeon and Camino, amongst others, were born out of Netscape's ashes. However the two most popular newcomers were called Mozilla and Firefox.

Mozilla was originally an open-source project aimed at improving the Netscape browser. Its code was eventually released as Netscape Navigator 7 and then 8, and later as Mozilla 1.0.

Mozilla was, in effect, an early version of another open-source browser: Firefox. Because it was open source, the public could contribute to it, adding the features it needed, the programming it required, and the support it deserved. The problems people saw in Internet Explorer were fixed by members of the open-source community via Firefox; for instance, the many security issues of IE 6 were almost entirely absent from the very first release of Firefox. Microsoft had another fight on its hands.

2005 – Present

Firefox was the browser that grew and grew in these years, capturing a larger market share every year. It was more user-friendly than most of its rivals, with high security and arguably more intelligent programming, and with such a large programming community behind it, updates have always been regular and add-on programs and features are released often. It prides itself on being the people's browser. It currently has a 28.38% market share.

Apple has had its own browser, Safari, since 2003, complete with its own problems, such as (until recently) the inability to run Java scripts. However, most Apple users seem happy with it, and a version capable of running on Windows has been released. It has had no major competitor on Apple Macs, and as such has largely stayed out of the Browser Wars. It currently holds a 2.54% market share and is slowly increasing.

Internet Explorer's market share has dropped from over 90% to around 75%, and it is still falling. It will be interesting to see what Microsoft attempts in order to regain such a high market share.

Opera currently holds 1.07%.

Mozilla itself only has a 0.6% market share these days.

The Future of Web Browsing

Web browsers come and go. It is the nature of technology (if such a term can be used) to supplant inferior software in very short periods of time. It is almost impossible for a single company to stay ahead of the competition for long. Microsoft has the advantage of shipping IE with every Windows PC, which covers over 90% of the market, as well as unprecedented resources: it can compete however it wishes for as long as it wishes. So there is no counting IE out of the future of web browsing.

Safari is in a similar position, being easily the most popular Mac web browser. Its long-term survival is dependent upon Apple and the sale of its computers.

These are the only two browsers almost guaranteed another five years of life, at least. Firefox may seem like another candidate, but the public is fickle: one bad release, or seriously lagging behind the new Internet Explorer 8 for long, could easily send its popularity into virtual oblivion.

However, it seems likely that community-driven browsers, such as Mozilla and Firefox, will be the only type of browser capable of competing with Microsoft's wealthy internet arm in the near future.

As for web browsing itself, will it change any time soon? Well, it already has for some online communities. For example, if you want to buy clothes you could enter an online 'world', creating a virtual 'you' to go from 'shop to shop', looking at products and trying or buying what you see. Some 'worlds' allow you to recreate yourself accurately, including weight and height, and then try on apparel such as jeans to give you an idea of how you would look in that particular item.

Will 'worlds' like this destroy normal web browsers such as IE? It seems unlikely. Traditional web browsers provide such freedom and ease of access that it is hard to see any alternative taking over. However, these 'worlds' are part of a new, 'thinking outside the box' wave of alternatives that some people will find attractive, and who really knows what the future will bring.

Thursday, September 4, 2008

Google Chrome

Google Chrome is a free, open-source web browser developed by Google.



Chrome's significant features:


Separate processes >> Chrome can run Web applications separately, in different windows or tabs, just as an operating system runs applications as individual "processes." "If one tab dies, you don't lose the others or the browser itself," explains Darin Fisher, who led the project.
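Purely as an illustration (this is not Chrome's actual code, which is written in C++), the one-process-per-tab idea can be sketched with Python's standard multiprocessing module: each "tab" runs in its own OS process, so a crash in one leaves the others and the parent untouched.

```python
import multiprocessing as mp

def render_tab(url):
    """Simulate a rendering engine for one tab.
    An unhandled exception here kills only this process,
    not the 'browser' parent or the sibling tabs."""
    if url == "http://crashy.example":
        raise RuntimeError("renderer crashed")
    print(f"rendered {url}")

if __name__ == "__main__":
    urls = ["http://a.example", "http://crashy.example", "http://b.example"]
    tabs = []
    for url in urls:
        p = mp.Process(target=render_tab, args=(url,))
        p.start()
        tabs.append((url, p))
    for url, p in tabs:
        p.join()
        # A non-zero exit code marks a dead tab; the others survive.
        status = "alive" if p.exitcode == 0 else "crashed"
        print(url, status)
```

The URLs and the crash trigger are made up for the demo; the point is simply that process boundaries, enforced by the operating system, contain the failure.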



Security >> Usually, when hackers try to install malware on a computer via the browser, they look for bugs in a component called the rendering engine. Chrome runs separate rendering engines and segregates each one with another layer of protection. "It's an extra level of security," says Fisher. This means a hacker would need to find not only a bug in the rendering engine but one in the protection layer in order for the malware to make its way out of the browser and into a computer.



Special tabs >> Chrome puts the tab buttons at the top of the window, not below the address bar as traditional browsers do.


Omnibox >> The Omnibox, which replaces the separate address bar and search bar in the Google Chrome browser, offers search suggestions, popular pages, and pages from your history. It will also automatically replicate a webpage's own search box, allowing site and query to be entered together: for instance, entering "amazon", pressing Tab, and then typing a search term will go straight to an Amazon results page for that term.


Default page >> On this page you will see your most visited webpages as nine screenshot thumbnails.


Incognito >> The incognito feature is a privacy mode in which no record of your surfing is kept. Malware and phishing are also protected against, with the Google Chrome browser automatically downloading a constantly updated list of harmful sites in the background.


Launch >> Web applications can be launched in their own browser window, without an address bar or toolbar.


[Images by Google.]

Wednesday, September 3, 2008

Recommended Anti-Spyware Software

Counter Spy

Counter Spy by Sunbelt Technologies offers almost stealth-like performance on your system. CNET gave this program its Editors' Choice Award for March 2008. That is a very good review, and not an easy award to achieve.


Spy Sweeper

Spy Sweeper by Webroot is one of the best anti-spyware programs you can buy. With over 45 awards to its name, you can be sure this program is highly rated and performs as advertised. Top Ten Reviews has also given Spy Sweeper a great review, awarding it the Gold Award and rating it very highly for effectiveness, ease of use, feature set, ability to customize, customer service, and ease of setup. Definitely worth the money, they state.


Spyware Doctor

Spyware Doctor was awarded PC Magazine's Editors' Choice in 2007. It comes highly rated as one of the best anti-spyware programs on the market. With rave reviews such as these, it is worth discussing why you need this software and how it can protect your computer from malicious code.

Trend Micro
Trend Micro AntiVirus plus AntiSpyware offers many new features and improvements over previous versions. It comes with an average rating from CNET, mostly due to a lack of independent testing. If you are looking to replace, upgrade, or install an anti-spyware program, this software deserves consideration.

Ad-Aware by Lavasoft

Ad-Aware by Lavasoft is one of the oldest anti-spyware programs available. This program has always had a free version, and Ad-Aware 2007 is no exception. If you are looking for a basic and effective anti-spyware program for your home computer or network, Lavasoft's Ad-Aware is for you.

McAfee Anti Spyware

McAfee anti-spyware has received a 7.6 out of 10 rating from CNET, a very good score. Advantages of McAfee anti-spyware include its ability to easily interface with other McAfee products and its constant monitoring of your system for spyware attacks.

Omniquad Anti Spyware

Omniquad Anti Spyware is made by a company based in London, UK. Omniquad AntiSpy offers the ability to scan for over 68,000 threats, quick scans (just five minutes for the average hard drive), automatic updates, and a user-friendly interface. Furthermore, it continuously scans memory and your registry for threats. With automatic updates, you can rest assured that your system will always be protected.

Pest Patrol Anti Spyware

CNET reviewed it on 5/14/2004 and gave it an excellent rating of 8.0 out of 10, while users gave it 8.4 out of 10. CA PestPatrol 5 is ranked number 5 for anti-spyware in 2008 on the Top Ten Reviews website.

Spyware Be Gone

Spyware Be Gone has won awards for its effective removal and blocking of spyware, adware, malware, and browser hijacking. Spyware Be Gone from Adware is a program that offers protection from browser hijackings in IE. A handy feature of this program is its ability to minimize to the system tray while a scan is in progress. There are quick-scan and full-scan options.

Maxion Spy Killer Software
This program will remove threats to your system such as spyware, malware, hijackers, key loggers, adware, and pop-up windows. Through advanced algorithms, Spy Killer will prevent these threats from installing as well as detect and remove any threats currently residing on your system.

NoAdware Software
NoAdware.net is the website for NoAdware, a software program designed to remove adware from your computer. According to their website, if you download music or videos, have a slow-running PC, are plagued by pop-ups, or your browser home page keeps changing, you are most likely suffering from an adware attack. NoAdware offers a free scan for your computer and a free download of their software.

Spy Hunter Anti Spyware
SpyHunter by Enigma Software Group is an anti-spyware program that will remove trojans, adware, and spyware from your computer. One great feature of SpyHunter is that it is free of charge: you can download it, scan your computer, and keep your system safe without paying a penny.

Xoftspy Anti Spyware

XoftSpySE anti-spyware is a program by Pareto Logic. This anti-spyware software is designed to scan your computer for spyware and other “parasites” so that your computer stays safe from these threats. You can view their online video at paretologic.com/products/xoftspyse. XoftSpySE has won several awards. Softpedia Editors Review rated it as very good with 4 out of 5 stars. The editors at Softpedia were impressed with the ease of use and the impressive number of anti-spyware definitions the program contained.