I would not oppose some means of allowing authors to add META tags that were honest and accurate and based on human judgment. But if we created META tags by some automated process just to get higher rankings on search engines that still use them, then we would be guilty of manipulation just as other advertisers are.
I'm not sure there's anything inherently dishonest about creating meta keyword tags from linked words. Those keywords are real keywords for the article -- it seems like a pretty good proxy for what humans would enter into a separate field anyway, and yet it doesn't cost us any human labor.
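Just to make concrete what "keywords from linked words" could mean, here is a minimal sketch. It assumes wiki-style [[link]] markup and made-up helper names; it is not the actual MediaWiki code, only an illustration of the idea:

```python
import re

def keywords_from_links(wikitext, limit=10):
    """Collect link targets (or their display labels) from [[...]] links
    as candidate keywords -- the 'linked words' of the article."""
    keywords = []
    for target, _, label in re.findall(r"\[\[([^\]|]+)(\|([^\]]+))?\]\]", wikitext):
        keywords.append((label or target).strip())
    return keywords[:limit]

def meta_keywords_tag(wikitext):
    """Render the candidate keywords as an HTML meta keywords tag."""
    return '<meta name="keywords" content="%s">' % ", ".join(keywords_from_links(wikitext))

print(meta_keywords_tag("The [[Eiffel Tower]] is in [[Paris|the French capital]]."))
# → <meta name="keywords" content="Eiffel Tower, the French capital">
```

Since the keywords are derived entirely from the links, this costs no human labor, which is exactly the trade-off under discussion.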
I guess my question is: what's the downside? What's the harm? Merely participating in a system that other people manipulate unfairly doesn't strike me as a real downside, so long as our own use of it is honest.
Fair enough; I agree there's little downside, especially if we did as Magnus suggests and sent them only to anonymous browsers. But to me, the fact that article A links to article B tells us nothing new; whatever it does tell us is already encoded in the mere fact that the link exists. Putting in extra META tags merely repeats the same information in a different place. So we aren't "adding keywords" at all--we're just making longer HTML for the same information content.
I'd like to see some means of actually making more meaningful content. If an author were able to see a list of links and choose, say, the top three or four, then /that/ would be genuinely useful information. Then it's not just the fact that A links to B (which could be completely irrelevant, since we link everything here); it would record the fact that some person thought the link from A to B was important.
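To illustrate the distinction: the selection step adds information that the raw link graph doesn't carry. A toy sketch (all names hypothetical, no relation to any real schema):

```python
def important_links(all_links, chosen, max_choices=4):
    """Filter an article's outgoing links down to the ones an author
    explicitly endorsed, capped at max_choices. The endorsement -- not
    the link itself -- is the new information being recorded."""
    chosen_set = set(chosen)
    endorsed = [link for link in all_links if link in chosen_set]
    return endorsed[:max_choices]

# Every link exists either way; only the author's judgment distinguishes them.
all_links = ["Paris", "France", "Iron", "Tourism", "1889"]
print(important_links(all_links, chosen=["Paris", "Tourism"]))
# → ['Paris', 'Tourism']
```

Keywords built from this endorsed subset would carry real human judgment, unlike tags generated mechanically from every link on the page.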