A while back, I can’t remember where, I saw a study showing that Google and other search engines were still using the keywords meta, but only if it met some specific criteria. (I think it was on SEOWorkers, one of the few out there in the field that doesn’t reek of “scam”)…
It was something like 8 or 9 keywords only, less than 128 characters total, with zero redundancy and 100% relevance to the text on the page.
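Going by those limits, a keywords meta would look something along these lines (the words here are just made-up placeholders for whatever your page is actually about):

<!-- example only; every one of these words should also appear in the page copy -->
<meta name="keywords" content="web,hosting,server,bandwidth,uptime,domain,dns,support">

Eight single words, around 50 characters of content, nothing repeated.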
Which makes sense from an efficiency standpoint – EVERY word on a page can’t be a keyword… a “phrase” is not a keyword… and if a “keyword” isn’t present on the page, it’s probably NOT relevant to the page. Setting cutoffs on how many keywords count and when they count as relevant would make the people who blow a full 1k of keywords on a couple hundred words of content think the meta doesn’t work anymore, when it’s their own ridiculous keyword stuffing that’s shtupping them.
In general I still include them on my pages, following those limits and treating the list like a word jumble – again, it’s called keywords, so using phrases doesn’t make any sense. I’ll often see things like:
content="web development, web servers, server development, html programming, web programming, server programming"
on and on and on for a k or 2 of code… when
content="web,development,server,programming,html"
works JUST as well, if not better. keyWORDS… It’s been my experience that there’s no need to restate the same words in phrase after phrase, and no need to list plurals, as the engines seem smart enough to pick up on those – keyWORDS… phrase stuffing is one of the things that made search engines devalue the meta in the first place!
… and if those words do not appear in the body text of your document, they have no business being in your keywords meta! – aka zero relevance.
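To make that concrete with a made-up fragment using the short list from above, the meta and the copy should back each other up:

<!-- again, just an illustration; each keyword shows up somewhere in the body text -->
<meta name="keywords" content="web,development,server,programming,html">

paired with body copy along the lines of:

<p>Articles on web development, covering server setup and the basics of HTML programming.</p>

Any word that can’t be found in the actual text gets pulled from the list.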
If you run a page through the SEOWorkers tool, it will tell you much the same thing:
Free SEO Analysis Search Engine Optimization Tool - SEO Workers
It kvetches when your metas are uselessly long and don’t have relevance to the page.