640,000 Websites Estimated to be Infected with Malware in Q3

Dasient, which monitors websites for malware, reported that an estimated 640,000 websites and 5.8 million web pages were infected with malware in the third quarter of 2009. A significant portion of those infected websites, 39.6%, were reinfected during the quarter. Websites can become reinfected if the vulnerability that allowed the website to be hacked is not fixed or if another vulnerability is discovered. Most infection code consisted of JavaScript (54.8%) or an iframe (37.1%), with other code, such as .htaccess redirects, accounting for 8.1%.
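
As a concrete illustration of the most common injection types the report describes, here is a minimal Python sketch that flags a hidden iframe injected into a page. The markup, URLs, and heuristics are hypothetical, and this is only a sketch of the concept, not Dasient’s detection method.

# Illustrative only: a naive scan for one common injection type mentioned in
# the report, a hidden (zero-sized or display:none) iframe pointing at a
# third-party host. Real scanners use far more signals than this.
from html.parser import HTMLParser

class HiddenIframeScanner(HTMLParser):
    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_starttag(self, tag, attrs):
        if tag != "iframe":
            return
        attrs = dict(attrs)
        style = attrs.get("style", "").replace(" ", "").lower()
        # Injected iframes are typically hidden or sized down to nothing.
        if "display:none" in style or attrs.get("width") == "0" or attrs.get("height") == "0":
            self.findings.append(attrs.get("src"))

# Hypothetical page markup for demonstration.
html = '<p>Welcome</p><iframe src="http://malware.example/x.php" width="0" height="0"></iframe>'
scanner = HiddenIframeScanner()
scanner.feed(html)
print(scanner.findings)  # ['http://malware.example/x.php']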

Gumblar Malware Code Replaced With Iframe Neutralizer

The Gumblar malware, which returned in the past several weeks, appears to be neutralized for the moment. In its return, Gumblar was using compromised websites to host its malware code instead of a website owned by the person(s) behind the hack. Other websites compromised by Gumblar then have code inserted into them that causes a file containing the malware code to be loaded from one of the websites hosting the malware.

The code on those websites hosting the malware has now been changed from the malware infection code to JavaScript that neutralizes iframes and a message that reads “iframes are EVIL! Hate Zeus!”. If the iframe-neutralizing code is loaded on a website that also contains other malware scripts, which occurs in some cases, it could possibly disable those scripts.

Gumblar inserted backdoor scripts as part of its hack, which someone other than the original hacker could have used to change the code stored on the host websites. It is also possible that the original hacker made the change for some unknown reason.

Google Announces That It Will Add Twitter Posts to Search Results

Just a few hours after Microsoft announced that it would be adding Twitter tweets to its search results, Google announced that they will be doing the same. Unlike Microsoft, which released a beta version of its Twitter search along with its announcement, Google will introduce a product that “showcases how tweets can make search better” in the coming months. The move by both search engines is part of their strategies to integrate more real-time information into their search results.

Bing Adding Twitter and Facebook Posts to Search Results

Microsoft announced today that they would be integrating Twitter tweets and Facebook status updates into Bing’s search results. Tweets can be searched from the beta of Bing Twitter search at http://www.bing.com/twitter. Microsoft made no mention of whether or when tweets will be integrated into the standard search results. In July, Bing added tweets from a limited group of high-profile individuals to some search queries related to those individuals. Facebook status updates are to be integrated into search results at an unspecified later date. There have been rumors that Google has been in discussions with Twitter and Facebook about integrating postings from their services into Google’s search results.

WordPress 2.8.5 Improves Security

WordPress 2.8.5, which was released yesterday, includes a fix for a denial-of-service (DoS) attack and a number of changes that remove code that could potentially be used to hack into WordPress. The denial-of-service attack uses specially crafted trackbacks that cause WordPress to consume a significant amount of processing power when they are processed, which could lead to WordPress becoming unresponsive. The code removal changes were originally developed for the upcoming version 2.9 and were backported to improve security as soon as possible.

Mobile Searches on Google Grew 30 Percent in Q3

During Google’s third quarter earnings call, Google’s CEO Eric Schmidt reported that Google had a 30 percent quarter-over-quarter increase in mobile searches during the quarter. Jonathan Rosenberg, Google’s Senior Vice President of Product Management, said that smartphone adoption was a driver of growth and that mobile search has grown twice as fast as desktop search during the last year in Japan and some other markets. Schmidt also reported that Google made over 120 “search quality improvements” during the quarter.

Google Adds Crawler View and Malware Details to Webmaster Tools

Google has added two tools to their Webmaster Tools service under the “Labs” label.

The Fetch as Googlebot tool shows the contents of the page and the HTTP response headers that Google receives for a specified page. Being able to see exactly what Google sees when it requests a page can be helpful in diagnosing problems that cause Google to have trouble crawling and indexing websites. It can also help in diagnosing hacks that modify the content of pages when Google requests them.
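
Fetch as Googlebot issues the request from Google’s own crawlers, which is what makes it authoritative. As a rough local approximation of the same comparison, one can request a page with a Googlebot-style User-Agent and with a normal browser User-Agent and diff the responses; this is only a sketch, the URL below is a placeholder, and hacks that key on Google’s IP addresses rather than the User-Agent will not show up this way.

# Rough local approximation of the comparison Fetch as Googlebot enables:
# request the same page with two User-Agent strings and compare the results.
import urllib.request

URL = "http://www.example.com/"  # placeholder; use the page being diagnosed
USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

bodies = {}
for name, ua in USER_AGENTS.items():
    req = urllib.request.Request(URL, headers={"User-Agent": ua})
    with urllib.request.urlopen(req) as resp:
        print(name, resp.status, resp.getheader("Content-Type"))
        bodies[name] = resp.read()

# If the body served to "googlebot" differs from the one served to "browser",
# the page may be cloaked or carrying conditionally injected code.
print("identical responses:", bodies["browser"] == bodies["googlebot"])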

When Google has detected that a website is infected with malware, the new Malware details tool provides samples of the malicious code that Google has detected. The types of code that Google will provide samples of include “injected HTML tags, JavaScript, or embedded Flash files” and the tool will also identify if URLs from the website are being redirected to another website that contains malware.

Google uses the “Labs” label to identify features that are still in development and that “may break at times.”

Google Expands Search Results Filtering Tool

Google has announced that they have added a number of new features to Search Options, their search results filtering tool included in the search results page. The tool already provided a number of options for filtering results based on when the pages were created, and it can now filter results from the past hour and from a specified date range. Results can now also be filtered to include more or fewer shopping sites. Users who are signed in to a Google Account and have Web History enabled can have results filtered to include or exclude pages they have already visited. The Search Options tool was released in May.

Google Adds Links to Sections of Web Pages In Search Results

Google has announced that they have begun to include links to sections of web pages in their search results. The new links come in two forms: the first provides a set of links to sections of the web page relevant to the search query, and the second provides a link to jump to a specific section of the page relevant to the search query. When these links are shown is determined algorithmically, but Google has provided information on what gives these links the best chance of being shown. Anchors are needed in the web page so that sections of the page can be linked to, the anchors should use descriptive names, and there should be a “table of contents” that links to the anchors.
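
As a rough sketch of that structure (the section titles and anchor-naming scheme below are invented for illustration, not taken from Google’s guidance), descriptively named anchors plus a table of contents linking to them could be generated like this:

# Minimal sketch: build a table of contents that links to descriptively
# named anchors, one per section of the page. Titles are hypothetical.
sections = ["Installation", "Configuration", "Troubleshooting"]

def anchor_name(title: str) -> str:
    # Descriptive, URL-friendly anchor names, e.g. "Troubleshooting" -> "troubleshooting"
    return title.lower().replace(" ", "-")

toc = "\n".join(
    f'<li><a href="#{anchor_name(t)}">{t}</a></li>' for t in sections
)
body = "\n".join(
    f'<h2 id="{anchor_name(t)}">{t}</h2>\n<p>...</p>' for t in sections
)

print(f"<ul>\n{toc}\n</ul>\n{body}")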

Google Introduces Parameter Handling to Webmaster Tools

Google has added the ability for webmasters to instruct Google to ignore URL parameters in Webmaster Tools. URL parameters are name/value pairs appended to the end of URLs (example: http://www.example.com?sessionid=1232132). The problem with URL parameters is that each variation of the parameters creates a URL that may or may not change the contents of the web page, so search engines treat each version as a separate web page. When the pages are actually the same, this can have a serious negative impact on the indexing and ranking of the pages in search engines. In general it is better not to use URL parameters in these situations, but the new feature should help to better handle situations where they do exist. In the help documentation for the new feature, Google states that up to 15 parameters can be set to be ignored and that they will treat the “requests as suggestions rather than directives.” Parameter handling can be found in the Settings section of Webmaster Tools. Yahoo has provided the ability to set parameters to ignore, as well as allowing a default value to be set for parameters, for some time.
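
To illustrate the idea behind the feature (this mimics the concept, not Google’s implementation, and the parameter name and URLs are hypothetical), URLs that differ only by an ignorable parameter such as sessionid can be collapsed to a single canonical URL:

# Sketch of the concept: strip parameters that do not change the page contents
# so that variants of a URL collapse to one canonical form.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

IGNORED_PARAMS = {"sessionid"}  # hypothetical parameter to ignore

def canonicalize(url: str) -> str:
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(canonicalize("http://www.example.com/page?sessionid=1232132&color=red"))
print(canonicalize("http://www.example.com/page?sessionid=9999999&color=red"))
# Both print http://www.example.com/page?color=red, so they count as one page.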