Basically, it's a "best practices" kind of thing; you are telling the Wikipedia community about your research, and in exchange, you may get some feedback from the few volunteers (often researchers themselves) monitoring those pages. Nothing more, nothing less, really.
This isn't quite right. For the last 3.5 years, new research with the potential to disrupt Wikipedian activities (surveys, interviews and experiments) has been documented and discussed via a lightweight process that involves describing the project on Meta. This process of documentation and discussion is a means of obtaining public consent, and I have yet to see a study that goes through it fail to run successfully. While the process was recently scrutinized by a small group including Piotr, it's certainly not merely a "best practice"; people expect it to work like policy. We've had several researchers attempt to run surveys of English Wikipedia only to be stopped and told to follow this process on Meta before continuing. If you contact me or another researcher at the Wikimedia Foundation, we will help you navigate this process. (Piotr, if you would like to reignite this discussion, I suggest we take it to a new thread.)
However, it doesn't sound like this is what Xiangju is asking about. It sounds more like he is asking about documenting a study after the fact. Here, I think that Meta has the potential to help you have a "broader impact" (jargon for affecting something other than your citation count). By listing your results on Meta, you enable Wikipedians to take advantage of your work more easily. You might even find that your citation rate goes up too, since a lot of us academics working on wiki stuff track and discuss research on Meta. I've actually had a few citations to reports I authored primarily on Meta. Regrettably, those don't "count" yet. I imagine it would have been different if I had a DOI.
-Aaron