4. The Semantic Web
The “Semantic Web” (sometimes called Web 3.0) is about making the internet more easily understood by machines, taking the guesswork out of interpreting HTML web content and ushering in a new era of storing and retrieving information online. Essentially it is the next stage in the structure of the web: a move towards flexible, searchable databases of information rather than the more static HTML pages we are used to.
Over the last few years industry leaders have been hard at work on this, and we’re now beginning to see the fruits of that work in public. All the main search engines and social networks are pushing website owners to make their sites compatible with this new technology, because they believe it is the future of the web.
A good example of the Semantic Web in action is Apple’s Siri. When you ask the voice recognition tool for information, it doesn’t send you to a website; it delivers the answer directly to your phone, drawn from the huge range of databases Siri queries. Google is doing the same: its search results pages are becoming more and more filled with information-based results, rather than just links to websites.
Website owners who fail to convert their websites and ensure they are indexed in semantic databases will miss out on the many new search engine enhancements rolling out this year. In our view it should be a priority for retail and travel clients (who have product databases they should be making “semantic friendly”), but all website owners should be considering the impact of this over the next two years.
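For a sense of what “semantic friendly” means in practice, here is a minimal sketch of turning a product database record into schema.org Product markup as JSON-LD, one of the formats the major search engines index for structured data. The product record and its field names are hypothetical, purely for illustration.

```python
import json

# Hypothetical product record, as it might come from a retailer's database.
product = {
    "name": "Leather Weekend Bag",
    "sku": "LWB-001",
    "price": "149.00",
    "currency": "GBP",
}

# Map the record onto schema.org's Product vocabulary as JSON-LD.
json_ld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": product["name"],
    "sku": product["sku"],
    "offers": {
        "@type": "Offer",
        "price": product["price"],
        "priceCurrency": product["currency"],
    },
}

# Embedded in a page inside a <script type="application/ld+json"> tag,
# this lets search engines read the product data directly.
print(json.dumps(json_ld, indent=2))
```

Generating this markup automatically from the existing product database is usually straightforward, which is why sites with structured catalogues are well placed to benefit first.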
No one knows if Google will still be the top search engine in 5 years, or if a new innovator will emerge, but everyone agrees whoever it is, they’ll be using Semantic technology.
5. The Foreseen and Unforeseen Consequences of Living in a World of Not Provided Data
This is not really a new trend: some sites are already seeing in excess of 40% “(not provided)” traffic, and we can’t see the issue going away, as that number is only ever going to increase.
To mitigate the issue, we expect more brands to start pre-qualifying their SEO strategy: first, by using tactical PPC to learn which keyword clusters convert well; second, through more intelligent analysis of landing pages, especially using Google Analytics features such as Multi Channel Funnels and Conversion Segments. This will help marketers understand the role each area of the site plays in the conversion funnel and identify better success metrics.
An unforeseen consequence of all this has been increased demand for ranking data to validate strategy. Webmaster Tools search query data is often inadequate for enterprise sites, with only 2,000 records shown. It is also genuinely difficult to compare data between Google Analytics and Webmaster Tools; for example, GA has no out-of-the-box feature for comparing web search to mobile search.
Worryingly, Google has already fired the first warning shot at companies that scrape its data. We expect a cat-and-mouse situation to develop as data providers scrape more of the data clients are requesting and try to stay ahead of Google.
This post was contributed to by Mike Sharp, Kathryn Jefferies, Tim Hooper and Jonathan Moore.