Thursday, April 18. 2013
Google's new update smells funny
After having been on top of the search results for years, things suddenly changed a few months ago.
A large site we have been optimising, with good, unique content that visitors liked, saw a huge drop after the update, which apparently prefers new sites with generic domain names. Sites with almost no content now outrank old sites purely because of the domain name. We had good hopes things would settle after a few weeks, since updates always shake things up. But nothing happened, nothing other than Google dealing small companies another blow: the new Google merchant popped up. Finding these changes rather disturbing, I wrote Matt, and he answered with his standard text: make your site better, add more content:
Those so-called trophy phrases, dear Matt, are the keywords that bring traffic and customers to the site! If a site with mirrors no longer ranks (top 3) for such a keyword, sales will drop! Not very happy with the standard answer, I wrote Matt a reply:
Posted by Tonnie Lubbers in SEO at 15:00 | Comments (0) | Trackbacks (0)
Defined tags for this entry: google update, image search, keywords, matt cutts, monopolie, ranking, search engines
Wednesday, January 30. 2013
Google image search changes and copyright issues
Is Google still a search engine, or is it on its way to becoming a content king?
Google is increasingly moving towards a portal model in which it shows you the content you searched for, rather than giving you links to the websites it found that content on. The shift towards being completely in control of content is noticeable. Despite the major point Google made of how good this was going to be for webmasters, the latest changes in Google's image search (at the moment only available on google.com) made many webmasters angry and frustrated. Because of the new presentation, webmasters complained about "significant" decreases in traffic; some even report losses of up to 60% on the Google Webmaster Central blog.

What changes did Google make?
Google changed its image search algorithm to make it harder to stumble across adult pictures, whether or not you are searching for them. The move is designed to ensure adult content is shown only to those who explicitly request it. In other words, you will have to be specific about what you want to see, or put the word porn or sex in your query. But what concerns users and webmasters more is the following:

Google uses large images without consent
Without the owners' consent, Google now shows large images taken from websites and placed on its own servers. Clicking on a result leads directly to the larger image; a visit to the website it came from is no longer necessary. People searching for certain product images will no longer visit the site to see the larger version, and therefore won't see the site or any other products and advertisements on it. Google knows it will probably get away with this, not just because a large number of small companies depend on the online sales and ad revenues that product image searches bring them, and will hesitate to block the bot, but also because of its mighty legal department. Remember the ongoing dispute over Google copying books? They know full well that what they are doing is wrong. The statement that "images may be subject to copyright" is nothing more than a smokescreen. Under the US Copyright Act, all images are copyrighted automatically. Small thumbnails may be fair use, but showing large or original images on any website other than the original, without consent, is a copyright violation. Google's defence against all the accusations is a simple 'then go and block the bot'. In other words: I will come to your house and steal your possessions, but I won't succeed if the door is locked.

The next step
The next step Google will take is showing ads around the results, leaving webmasters who spent a lot of energy building good websites empty-handed. How this will end is uncertain, but it has certainly stirred up a lot of emotions, as we can read on the Google Webmaster Central blog. Some quotes from concerned people:
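Google's 'then go and block the bot' answer refers to robots.txt. For site owners who do want to take that route, Google's image crawler can be refused while the regular web crawler is still let in. A minimal sketch (the Googlebot-Image token is Google's documented image crawler; the blanket Disallow is illustrative):

```text
# robots.txt -- refuse Google's image crawler, keep the web crawler
User-agent: Googlebot-Image
Disallow: /

User-agent: Googlebot
Disallow:
```

Note that this keeps your images out of Google Images entirely, which is exactly the trade-off small shops hesitate over: no large copies on Google's servers, but also no image-search traffic at all.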
Posted by Tonnie Lubbers in Algemeen at 15:37 | Comments (0) | Trackbacks (0)
Defined tags for this entry: copyright, google, google images, google update, image search, search engines
Google image search and my letter to Matt
Hi Matt,
For years now I have worked hard to put up a good website, and got bumped time and time again by people stealing my work. And now my content is being stolen by Google! You at Google are displaying large photos from my site without permission to do so! Many photos are mine, but there are also pictures you took that I paid for and am not allowed to redistribute. We fought spammers and scrapers for years, and now Google turns out to be the worst one. Is this the way you want the web to be? Google using other people's content without the owners benefiting from it? You guys copied books, you came up with a new shop that killed most of my traffic and therefore my customers, and now you plain and simply steal my property? Google once had a motto: "Don't be evil". Well, it turns out YOU ARE THE EVIL!!!! Not so sincerely anymore, Tonnie Lubbers

Note: I sent this letter to Matt Cutts, Google's spam cop, and will post a larger article about this subject on this weblog soon.
Posted by Tonnie Lubbers at 13:47 | Comments (0) | Trackbacks (0)
Defined tags for this entry: google, google images, google update, image search, matt cutts, search engines
Wednesday, July 25. 2012
Google guidelines for good and reliable articles - being found in Google
Google regularly updates the way it indexes pages. The much-discussed Panda update caused quite a bit of confusion and unrest among website publishers and website builders: many pages dropped in the search results after the update, and above all there was much discussion about what is and is not important for making a good website.
On an official Google channel, guidance on high quality, one of its employees, Amit Singhal, gives guidelines that sites should meet, or rather questions by which you could recognise good-quality sites. Some guidelines are clear; others, however, can reasonably be disputed or questioned. For each question I have added my remarks and thoughts. In translating I have stayed as close as possible to the intention of each question, but I am only human and may in a few cases have missed the mark. Please send me your remarks. Thanks in advance. Google's quality guidelines translated into Dutch:
Conclusion: Google apparently struggles with this itself, and as already indicated, that is not surprising. Assigning authority, quality and originality after the fact is a particularly difficult task. For authors, publishers and website owners only one guideline remains that holds up: write your own articles, with passion and for your reader.
Posted by Tonnie Lubbers in SEO at 13:34 | Comments (0) | Trackbacks (0)
Defined tags for this entry: advies, google, google update, inhoud, panda, ranking, richtlijnen, search engines, seo, zoekwoorden
Thursday, March 29. 2012
Matt Cutts - Let Googlebot crawl your JavaScript and CSS
Normally Matt Cutts (Google's SEO spokesman) gives tips and fairly general explanations of how Google crawls the web and what you should or should not do. Now Matt makes a public appeal to let the Google bot crawl your JavaScript and CSS files, in order to 'better understand what your site is about'.
JavaScript files and directories, and sometimes the CSS ones as well, are often declared 'off limits' to search robots in robots.txt: the robot may not crawl or index these files. A fairly logical choice, really, for what business does a search robot have with the technology behind the site!? One of the arguments Matt uses for letting Googlebot crawl these files anyway is the example of information included via JavaScript: because the JavaScript cannot be crawled, Google cannot get at the information it pulls in either. Will you allow the Google bot to crawl these files? Or have you perhaps used JavaScript or CSS to feed the search robot information your visitors never get to see!?
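You can check what your own robots.txt actually forbids before deciding. A small sketch using Python's standard-library robots.txt parser, fed a hypothetical robots.txt with the blocking pattern Matt argues against (the `/js/` and `/css/` paths are illustrative):

```python
from urllib import robotparser

# Hypothetical robots.txt that declares script and style
# directories 'off limits' to all crawlers.
ROBOTS_TXT = """\
User-agent: *
Disallow: /js/
Disallow: /css/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot falls under the wildcard group: these fetches are refused,
# so Google cannot see information pulled in via these files.
print(rp.can_fetch("Googlebot", "/js/app.js"))     # False
print(rp.can_fetch("Googlebot", "/css/site.css"))  # False
# Regular pages remain crawlable.
print(rp.can_fetch("Googlebot", "/index.html"))    # True
```

Removing the two Disallow lines (or adding a Googlebot group without them) is all it takes to follow Matt's advice.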