Thanks for the info about double-counting.
That is a good point: people have the alternative of reading about
Lila in article space. I can't think of a better explanation.
Pine
> Date: Sat, 10 May 2014 16:34:56 -0700
> From: Oliver Keyes <okeyes(a)wikimedia.org>
> To: "A mailing list for the Analytics Team at WMF and everybody who
> has an interest in Wikipedia and analytics."
> <analytics(a)lists.wikimedia.org>
> Subject: Re: [Analytics] stats.grok.se questions
> Message-ID:
> <CAAUQgdBvbM-xUoiO19NUjVAB3rXaFRM-Y8A-1fvYX7ZCAGEr9g(a)mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
>
> It should not double-count. Scenario: you go to [[Foo]], a redirect to
> [[Bar]]. A request is logged for http://en.wikipedia.org/wiki/Foo, and
> MediaWiki internally examines Foo, works out it's a redirect to Bar, and
> provides the content of Bar instead. None of that happens at a level where
> the request logs directly catch it - MediaWiki doesn't send back a message
> saying "uh, I think you want Bar" that would necessitate a second request;
> it just provides Bar's content.
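Oliver's scenario can be sketched as a toy model (hypothetical names and data; real MediaWiki internals are far more involved):

```python
# Toy sketch: server-side redirect resolution means one request and
# one log entry, even though the target page's content is returned.
REDIRECTS = {"Foo": "Bar"}               # [[Foo]] redirects to [[Bar]]
CONTENT = {"Bar": "Bar's article text"}

request_log = []

def serve(title):
    """Log the requested title, resolve any redirect internally,
    and return the target page's content in the same response."""
    request_log.append(title)            # the request log sees only "Foo"
    target = REDIRECTS.get(title, title) # internal lookup, no second request
    return CONTENT[target]

body = serve("Foo")                      # returns Bar's content
# request_log == ["Foo"]: a single entry, so nothing is double-counted
```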
>
> In regards to Lila's page: maybe this is the first time people were pointed
> to her user page ;). I have no doubt that people are interested in finding
> out about her, but I imagine the reaction of most individuals would be to
> go to the article about her rather than her user page.
>
>
> On 10 May 2014 02:14, Alex Druk <alex.druk(a)gmail.com> wrote:
>
> > Hi there,
> >
> > I have no idea about Lila, but the answer to your question about redirects
> > and articles is NO (at least in my opinion).
> >
> > If there were double counting, the number of hits on articles would always
> > be greater than on their redirects. That is not so. For example, you can
> > compare page views on "Ghoramara" (a redirect to Ghoramara Island), 24824
> > pageviews in May according to stats.grok.se, with "Ghoramara Island" (the
> > actual article) - 68 pageviews.
> >
> > So, no double counting on stats.grok.se. It seems to me that when a user
> > arrives via a redirect, the hit is listed in the raw file under the
> > redirect, but not under the actual article.
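Alex's reading of the raw files can be sketched as a toy tally (the counts and title spellings here are illustrative; the real dump format differs):

```python
from collections import Counter

# Hypothetical raw request log: titles exactly as users requested them
raw_requests = ["Ghoramara", "Ghoramara", "Ghoramara", "Ghoramara_Island"]

# Each hit is tallied under the title actually requested, so a hit on
# the redirect never also counts toward the target article.
views = Counter(raw_requests)
# views["Ghoramara"] == 3, views["Ghoramara_Island"] == 1
```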
> >
> > Alex
> >
> >
> >
> > On Sat, May 10, 2014 at 9:48 AM, ENWP Pine <deyntestiss(a)hotmail.com> wrote:
> >
> >> Following up on
> >> http://lists.wikimedia.org/pipermail/analytics/2014-April/001898.html
> >>
> >> Would including the redirect hits with the endpoint hits result in a
> >> user's single request for content being double-counted for the page where
> >> they eventually land?
> >>
> >> Here's another question about pageview statistics. On May 2 and May 3
> >> there are spikes in the statistics for many users' user pages on English
> >> Wikipedia, but en:User:LilaTretikov has surprisingly few page hits. Does
> >> anyone have an explanation for how Risker is getting a lot more userpage
> >> views than the new executive director on English Wikipedia? I have a
> >> hard time believing that more people and bots are curious about the user
> >> pages of Risker and lots of other Wikipedians than they are about the user
> >> page of Lila in the past few weeks, so I am wondering if there is some
> >> error in the pageview statistics or if there are a very large number of
> >> bots that are generating page views by linking to the user pages of editors
> >> who edit certain pages.
> >>
> >> Pine
> >>
> >> _______________________________________________
> >> Analytics mailing list
> >> Analytics(a)lists.wikimedia.org
> >> https://lists.wikimedia.org/mailman/listinfo/analytics
> >>
> >
> >
> >
> > --
> > Thank you.
> >
> > Alex Druk
> > alex.druk(a)gmail.com
> > (775) 237-8550 Google voice
> >
> > _______________________________________________
> > Analytics mailing list
> > Analytics(a)lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/analytics
> >
> >
>
>
> --
> Oliver Keyes
> Research Analyst
> Wikimedia Foundation
>
Following up on http://lists.wikimedia.org/pipermail/analytics/2014-April/001898.html
Would including the redirect hits with the endpoint hits result in a user's single request for content being double-counted for the page where they eventually land?
Here's another question about pageview statistics. On May 2 and May 3 there are spikes in the statistics for many users' user pages on English Wikipedia, but en:User:LilaTretikov has surprisingly few page hits. Does anyone have an explanation for how Risker is getting a lot more userpage views than the new executive director on English Wikipedia? I have a hard time believing that more people and bots are curious about the user pages of Risker and lots of other Wikipedians than they are about the user page of Lila in the past few weeks, so I am wondering if there is some error in the pageview statistics or if there are a very large number of bots that are generating page views by linking to the user pages of editors who edit certain pages.
Pine
Hi everyone,
At the moment the Wikimedia Foundation collects page view statistics,
but no image view statistics. This means we don't know how many views
our images and other media files get; we try to derive them from page
views. A long time ago someone enabled logging of image views for NARA
images. The data has been accumulating for several years, but nothing has
been done with it yet. We worked on it this weekend at the
hackathon. First results at
https://commons.wikimedia.org/wiki/Commons:GLAMwiki_Toolset_Project/NARA_an…
Maarten
Dear All,
My name is Dorothy Howard and I'm a Wikipedian-in-Residence at a library
consortium called METRO
<https://en.wikipedia.org/wiki/Wikipedia:GLAM/Metropolitan_New_York_Library_…>
in New York.
I'm reaching out because I have a friend and colleague from a non-Wikipedia
job who is interested specifically in qualitative data visualization and
is a researcher at the Parsons Institute for Information Mapping at the New
School.
After attending the Chapters Dialogue conference in Berlin and seeing the
result of the WMF-sponsored chapters dialogue research, a portion of which
was qualitative, I became more interested in thinking about how we can use
qualitative (not exclusively; I'm also interested in quantitative) data
visualization to model our movement on a national or even more local scale.
My colleague at the New School is interested in pursuing this type of
research, and I am writing to find someone to talk with more about
these matters and the potential to develop some steps forward. I'm
interested in projects such as "Measuring community health: Vital signs
for Wikimedia projects"
<https://wikimania2014.wikimedia.org/wiki/Submissions/Measuring_community_he…>
that the Analytics team has proposed, but when it comes specifically to
GLAMs, I am interested in doing analytics about the relationships between
GLAMs and Wikipedia, and the networks of knowledge and exchange which
occur. I do not know if this type of research is already being done, but I
would appreciate links to any such related research.
Would love to talk more to someone working on or interested in similar
things.
Thanks,
Dorothy
Hi!
The speed bumps from the eventlogging migration are almost ironed out:
1. db1048 has had the eventlogging uuid fields made formally UNIQUE KEY. I
gather Ori will now run some validation against logs to check for remaining
gaps.
2. db1046 which died mid-migration has been restored and is catching up.
This doesn't really affect Analytics except that it's to be part of
db1047's replication chain for eventlogging.
3. db1047 is finishing up reloading log data and removing the CONNECT
federated tables involved in bug 64445[1].
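The effect of making the uuid fields UNIQUE (item 1) can be sketched with Python's built-in sqlite3 standing in for MySQL; the table and column names here are assumptions, not the real eventlogging schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (uuid TEXT UNIQUE, payload TEXT)")
conn.execute("INSERT INTO events VALUES ('abc-123', 'first event')")

# A second row with the same uuid violates the UNIQUE constraint, which
# is what blocks duplicate eventlogging rows after the migration.
try:
    conn.execute("INSERT INTO events VALUES ('abc-123', 'duplicate')")
    duplicate_inserted = True
except sqlite3.IntegrityError:
    duplicate_inserted = False
# duplicate_inserted is False; the table still holds exactly one row
```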
As something of a consolation prize, "analytics-store.eqiad.wmnet" is now
open for SELECT queries from the 'research' user. This box:
- Is a CNAME for dbstore1002.eqiad.wmnet.
- Replicates all wikis in one place.
- Can be hammered. Please feel free.
- Can have scratch space for temporary writes (but doesn't yet).
- Can replicate eventlogging too (but doesn't yet).
If anyone has some suitable read-only reports to try out, I would
appreciate it if you would do so and report back.
BR
Sean
[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=64445
--
DBA @ WMF
We are changing EventLogging to write events to m2 instead of db1047. The
migration will take up to 12 hours (but probably less). Also, we may end
up with gaps in the data written to the database throughout this period.
We will reply to this thread once the migration is complete.
Samuel Klein, 30/04/2014 05:35:
> Asking 1/1000 users of tool X a single open-ended question ("please
> give us feedback on X" or "how is X working for you?") can be a handy
> way to encourage brief input from a cross-section of users,
We do have such a feature in MediaWiki, though: mediawiki.feedback.js.
It's just a JavaScript popup which saves the comment to a page on the wiki.
> many of
> whom would not otherwise comment at all. And for some tools (such as
> UploadWizard) there is no obvious place to leave comments, and opening
> Bugzilla is a new-tab + multi-step process away.
UploadWizard (like VisualEditor) uses the feature above. Maybe it needs an
option to be offered more prominently under some conditions?
This is probably the most viable option here: almost no technical effort
and more value in output.
Tilman Bayer, 29/04/2014 21:58:
> might be worth revisiting
> LimeSurvey, which appears to have undergone a complete rewrite since
> that installation was removed from WMF servers for security concerns
> around 2011.
+1. It will need to be done anyway, at some point, e.g. if a general
editor survey is tried again.
> multilingual
> support than other solutions [...]
> lack of integrated language support in
> Surveymonkey, or just because the focus was on per-project results
> anyway?
Agreed on all the rest but this point specifically. It seems
SurveyMonkey is really out of the question. However, how many languages does
Qualtrics support?
LimeSurvey says 50; it is translated on a public instance of GlotPress.
GlotPress is from Automattic and is used to make some WordPress locales,
hence some translatewiki.net people have experience with it. However, I
wasn't able to gather much information about it; I only know that it's yet
another web tool for the .po format. Maybe Stu can put us in contact with
someone with more insight (especially on how much it's used and how high a
priority it is for Automattic)?
Nemo
Hi everyone,
Thanks for your thoughtful comments about using survey tools for feature development.
Personally, I find surveys invaluable for learning what users think of the features we develop. Behavioral metrics are wonderful for learning how our users use our tools, but they don’t tell us how they feel about them. Surveys help us get that information quickly from a lot of users, providing useful qualitative feedback to complement our quantitative data.
I have used online surveys successfully throughout my career, and look forward to continuing to use this tool here at Wikimedia, as another important data point to inform our product decisions. They are particularly useful for hearing from our ‘silent majority’ of users — instead of the ‘vocal minority’ that dominates our talk page discussions.
For example, we are now using surveys in multiple languages to get advance feedback about Media Viewer, and the early responses have helped us identify key issues from the perspective of readers (our largest user group), not just editors. You can review these first results here, with more to come:
https://www.mediawiki.org/wiki/Multimedia/Media_Viewer/Survey
To address Mark’s original question as to whether or not to use a survey for Upload Wizard, I believe that surveys are needed for that tool as well, as many media contributors do not edit frequently, if at all, even if they are registered users. For example, research has pointed out that only a small fraction of contributors to Wiki Loves Monuments become editors, and it’s likely that many mobile uploaders may not be editors themselves. Casual contributors are different from editors in many ways, and are also likely to be uncomfortable with talk page tools.
So from my viewpoint, a simple survey popup form seems the most practical way for us to collect that feedback. We want to capture user feedback about Upload Wizard as soon as they have completed their upload, while it’s still fresh in their minds. So I would recommend a prominent invitation to leave feedback on the final step of the upload process, with the popup form opening right over that final page, rather than sending users off to a separate talk page.
I am open to whichever survey platform best serves our needs, as long as the platform works reliably, doesn’t require additional development and provides all the standard features I am used to for easily creating surveys, collecting the data, then analyzing, visualizing and sharing the results. Personally, I find that Survey Monkey provides an excellent toolkit for all these functions, which lets me do my work efficiently.
But if another open source tool exists that provides the same services with the same level of quality, I would be happy to consider it. And I have been recommending to Erik and Howie regularly that we consider developing (or adapting) a survey tool that can better integrate with our wikis. However, this is not a trivial task, if we want to match the level of functionality available from other solutions.
It certainly would be worth talking to the folks at LimeSurvey to discuss improvements to their platform, which could probably be adapted for our purposes, perhaps even on a contract basis. I would be happy to participate in this discussion, to help identify key requirements, as a regular survey customer. I should also point out that I love Dario’s 'micro-survey’ idea, a tool which I would use in a minute if it were available — so this might be a direction we might want to explore as part of this investigation.
For now, Survey Monkey offers us a practical and reliable toolkit, which I am comfortable using until we find a better solution. And to answer Tilman’s question, the reason we created separate surveys in multiple languages for the current Media Viewer campaign was so that we could easily track, visualize and share responses for each language independently. Survey Monkey does provide multiple language support (under ‘Survey Options’ in the design view), though you still need to have the questions and answers translated separately.
I hope these observations are helpful. I have added a few comments below as well. Please let me know if you would like any more clarifications on why I think surveys are important to our work, and should continue to be used to learn from our users.
Thanks,
Fabrice
On Apr 30, 2014, at 12:47 PM, Andrew Gray <andrew.gray(a)dunelm.org.uk> wrote:
> On 30 April 2014 08:52, Federico Leva (Nemo) <nemowiki(a)gmail.com> wrote:
>>
>> We do have such a feature in MediaWiki, though: mediawiki.feedback.js. It's
>> just a JavaScript popup which saves the comment to a page on the wiki.
>>
>>> many of
>>> which would not otherwise comment at all. And for some tools (such as
>>> UploadWizard) there is no obvious place to leave comments, and opening
>>> Bugzilla is a new-tab + multi-step process away.
>>
>> UploadWizard (like VisualEditor) uses what above. Maybe it needs an option
>> to be offered more prominently under some conditions?
>> This is probably the most viable option here, almost no technical effort and
>> more value in output.
>
> It's certainly simpler to implement, but anything that involves
> on-wiki recording has two main problems:
>
> * friction in saving the entry (eg edit conflicts, login required,
> user IP blocked)
> * privacy problems (comments are public and effectively attributed)
>
> This isn't much of an issue for things like "please give us feedback
> on the new fancy upload tool" - where everyone can be expected to have
> a functioning account and be aware of how the wiki works, but if you're
> going to be gathering feedback on reader-focused things it breaks
> down.
>
Thanks, Andrew, for these thoughtful observations.
I agree with your views.
> --
> - Andrew Gray
> andrew.gray(a)dunelm.org.uk
On Apr 30, 2014, at 12:52 AM, Federico Leva (Nemo) <nemowiki(a)gmail.com> wrote:
> Samuel Klein, 30/04/2014 05:35:
>> Asking 1/1000 users of tool X a single open-ended question ("please
>> give us feedback on X" or "how is X working for you"?) can be a handy
>> way to encourage brief input from a cross-section of users,
>
> We do have such a feature in MediaWiki, though: mediawiki.feedback.js. It's just a JavaScript popup which saves the comment to a page on the wiki.
>
This doesn’t seem very practical, for the reasons Andrew outlines above.
>> many of
>> which would not otherwise comment at all. And for some tools (such as
>> UploadWizard) there is no obvious place to leave comments, and opening
>> Bugzilla is a new-tab + multi-step process away.
>
> UploadWizard (like VisualEditor) uses what above. Maybe it needs an option to be offered more prominently under some conditions?
> This is probably the most viable option here, almost no technical effort and more value in output.
>
> Tilman Bayer, 29/04/2014 21:58:
> > might be worth revisiting
> > LimeSurvey, which appears to have undergone a complete rewrite since
> > that installation was removed from WMF servers for security concerns
> > around 2011.
>
> +1. It will need to be done anyway, at some point, e.g. if a general editor survey is tried again.
>
I support the idea of talking to the LimeSurvey developers, to investigate this further.
But I suspect that some development may be required to meet our requirements, as outlined above.
> > multilingual
> > support than other solutions [...]
> > lack of integrated language support in
> > Surveymonkey, or just because the focus was on per-project results
> > anyway?
>
> Agreed on all the rest but this point specifically. It seems SurveyMonkey is really out of the question. However, how many languages does Qualtrics support?
Why do you say that Survey Monkey is out of the question? We have used it successfully for other projects before.
> LimeSurvey says 50; it is translated on a public instance of GlotPress. GlotPress is from Automattic and is used to make some WordPress locales, hence some translatewiki.net people have experience with it. However, I wasn't able to gather much information about it; I only know that it's yet another web tool for the .po format. Maybe Stu can put us in contact with someone with more insight (especially on how much it's used and how high a priority it is for Automattic)?
>
Keep in mind that "language support" from most platform providers typically means that the standard buttons and error messages are translated into a variety of languages.
All survey content has to be translated separately. For our purposes, we have successfully used on-wiki translation tools to get our surveys translated into other languages.
> Nemo
On Apr 29, 2014, at 8:35 PM, Samuel Klein <meta.sj(a)gmail.com> wrote:
> I think very very simple surveys -- or a standard brightly colored
> "Feedback" button that's visible from a tool's page -- are useful.
>
> Asking 1/1000 users of tool X a single open-ended question ("please
> give us feedback on X" or "how is X working for you?") can be a handy
> way to encourage brief input from a cross-section of users, many of
> whom would not otherwise comment at all. And for some tools (such as
> UploadWizard) there is no obvious place to leave comments, and opening
> Bugzilla is a new-tab + multi-step process away.
>
Thanks, SJ.
You make some very good points, which I agree with.
> SJ
>
> On Tue, Apr 29, 2014 at 3:47 PM, Mark Holmquist <mtraceur(a)member.fsf.org> wrote:
>> On Tue, Apr 29, 2014 at 12:30:57PM -0700, Oliver Keyes wrote:
>>> Generally speaking, my advice to the multimedia team would be "don't go near
>>> surveys". I've done a lot of them in the last 3 years, and the one thing
>>> I've learned is that surveys are very, very difficult to get right. Another
>>> thing I've learned is that if you don't get them right, the results are
>>> meaningless and it's hard to tell when that happens.
>>>
>>> As I understand it, Jared's team is hiring a qualitatively-focused UX
>>> researcher or two in the upcoming budget to do research around design and
>>> feature usage; we should hold off until they come in, first because they're
>>> simply going to be better at it than we are, and second because it's
>>> probably going to be frustrating for them if they come in and find a tool
>>> locked in as How We Do Things (and frustrating for us if they want to
>>> change that tool):
>>
>> Hi, Multimedia list!
>>
>> Just thought I'd cross-post this reply to my call for help with surveys
>> [0] on the analytics list.
>>
>> I'm sort of of the mind that skipping the survey for UploadWizard is a
>> good idea, especially now that I've thought about it more - using a survey
>> from a third-party site is silly for logged-in users, because logged-in
>> users will know how to use the talk pages and/or bugzilla.
>>
>> Thoughts?
>>
>> [0] http://lists.wikimedia.org/pipermail/analytics/2014-April/001911.html
>>
>> --
>> Mark Holmquist
>> Software Engineer, Multimedia
>> Wikimedia Foundation
>> mtraceur(a)member.fsf.org
>> https://wikimediafoundation.org/wiki/User:MHolmquist
>>
>>
>> _______________________________________________
>> Multimedia mailing list
>> Multimedia(a)lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/multimedia
>>
>
>
>
> --
> Samuel Klein @metasj w:user:sj +1 617 529 4266
>
> _______________________________________________
> Multimedia mailing list
> Multimedia(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/multimedia
_______________________________
Fabrice Florin
Product Manager
Wikimedia Foundation
http://en.wikipedia.org/wiki/User:Fabrice_Florin_(WMF)