On Mon, Oct 12, 2015 at 12:50 PM, Volker Eckl <veckl@wikimedia.org> wrote:
Google will give an introduction to AMP at the SFHTML5 Meetup on 23 Oct:

I'm in.

FWIW, Google appears to be serious about this:
http://searchengineland.com/google-amp-coming-rank-fast-238046

Luis
 
On Fri, Oct 9, 2015 at 6:43 PM, Jon Katz <jkatz@wikimedia.org> wrote:

On Fri, Oct 9, 2015 at 12:34 AM, Joaquin Oltra Hernandez <jhernandez@wikimedia.org> wrote:
If you really wanted to, you can subset what you send to mobile browsers and get the same benefits (provided you use a really good CDN).

I think this announcement + the transcoding work Google is doing show that this ^ is something we should be strongly considering. If Google can transcode our content and make it significantly faster (as Gergo showed in another thread), and/or other sites are adopting similar technology, then our users are going to expect a level of speed far higher than we can currently provide. I don't care whether we use Google's or our own, but I do want to make sure we aren't reinventing the wheel if we don't have to.

The conversations as to whether or not Google is acting out of self-interest are fairly moot (they are... always), but I think Luis's points are very apt about Google's self-interest being more closely aligned with ours on the web than that of the other big players in this space.

On Fri, Oct 9, 2015 at 9:27 AM, Luis Villa <lvilla@wikimedia.org> wrote:
On Thu, Oct 8, 2015 at 5:39 PM, Toby Negrin <tnegrin@wikimedia.org> wrote:
Hi Luis --

I honestly don't see a lot of difference between Google, Twitter, and Facebook, since they are all ad-supported entities with a fiscal responsibility to track their users and sell the data. Apple's a bit different on the surface since it has a different business model. I agree that these are bad for the internet, but so are incredibly slow web pages that make apps essentially required for a good experience.

I agree that the companies all have (essentially[1]) the same motives at the company level. The difference is that Google's technical approach to solving the latency problem is not explicitly tied to Google or to particular Google apps. (There is a pure web demo, for example, which works in any mobile browser, including Firefox for Android, and Twitter - a Google competitor - has already adopted it.) In contrast, Facebook's and Apple's "solutions" for fast reading are very explicitly tied to (1) apps, not browsers, and (2) apps specifically from those companies. There will never be a future where Facebook's solution for latency works outside of Facebook; there is (at least in theory, and possibly in practice w/ Twitter) such a future with AMP.

Or to put it another way: Google's solution still might not be good, but it's at least possible that it could keep content on the open web; Facebook and Apple are pretty explicitly trying to kill the open web. There is no way the long game of the FB/Apple apps leads to good outcomes for independent publishers like us.
 
On the analytics, this would probably not include their use of our content in the knowledge graph or elsewhere

Oh, it definitely won't. But it might give us some leverage in those discussions - once Google has conceded that the analytics from some cached pages should be shared, it is no longer such a huge leap to analytics on other types of "cached"/processed data.
 
and also might be troublesome for those who prefer Google not to track their reading.

There is a lot of devil in those details, of course, but for those coming from Google Search (still the vast majority of our users) the first leap is already tracked/known to Google. This doesn't necessarily make that worse. (Much depends on how the caching occurs; their ability to track the *second* page you read would be new, at least for iOS users - Android users already have this problem, I believe.)
 
Bryan's ticket is a good embarkation point for thinking about supporting new clients; Reading is also planning some Reading infrastructure work for the summit which could relate[1]

Great link, thanks.
 
[1] The subtle difference, from our perspective, is that Google has pretty strong incentives to keep the open web viable, because making sense of (and selling ads on) the open web is their core competence. Facebook and Apple, in contrast, have no strategic reason to keep the open web viable: if they can turn every publisher into a FB-only or Apple-only publisher, they'd happily do that. Of course, an open web that doesn't depend on Google would be even better, but that's not in the cards at this point unless Mozilla wakes up.

On Thu, Oct 8, 2015 at 2:02 PM, Luis Villa <lvilla@wikimedia.org> wrote:
Toby - 

I'm generally 1000% on board with a slow-follower approach for anything user-facing. The only reason I might make an exception here is that the competitors you mention are all pretty awful for the web generally, and this already has uptake from Google and Twitter. (Two isn't great, but two + a slim opportunity for growth is way better than the guaranteed never-greater-than-1 we'll see from FB's option.)

The other reason this intrigues me is that if Google builds in some analytics, it might give us a better sense of how our content is currently being used than we have today. Not much, obviously, but at least something. (Remember that in this scenario - direct access from Google properties - they already have all that information; the only question is whether it gets shared with us so that we can do something useful with it.)

That said, if implementing it is non-trivial, it doesn't make sense to spend a huge number of cycles to fast-follow. Hopefully some of the improvements Bryan mentions will make it easier in the future - it certainly doesn't look like we're in a world where the number of front ends is going to get smaller any time soon.

Luis

On Thu, Oct 8, 2015 at 1:25 PM, Toby Negrin <tnegrin@wikimedia.org> wrote:
Thanks Bryan and Pine. 

My feeling is that there are many, many new interfaces and form factors emerging right now, and we should be cautious about adoption. For example, Facebook's Instant Articles, Apple News, and even Snapchat have offerings similar to AMP.

They all seem to be focusing on article speed in a landscape where most pages are larded up with a variety of trackers, ads, and other scripts (which we don't have, although we have our own performance challenges), with the ultimate goal of owning the delivery platform.

I'm nervous about picking winners in such a landscape, although I'm excited about prototypes like the Apple Watch app that came out of the Lyon hackathon. I feel like a slow-follower model, where we see which solution (if any) becomes widely used, is more appropriate for us.

-Toby


On Thursday, October 8, 2015, Pine W <wiki.pine@gmail.com> wrote:

Hi Bryan,

Ah, I was thinking of the two different mobile web editing experiences (not two different apps) for Android, depending on form factor. My understanding is that tablets now have VE enabled on mobile web (I have yet to try it), while phones do not yet.

Pine

On Oct 8, 2015 12:56 PM, "Bryan Davis" <bd808@wikimedia.org> wrote:
On Thu, Oct 8, 2015 at 1:32 PM, Pine W <wiki.pine@gmail.com> wrote:
> We currently have at least 6 channels, I believe:
>
> 1. Desktop Web
> 2. Mobile Web
> 3. Android phone
> 4. Android tablet

I don't think that we have separate native apps for the phone and
tablet form factors.

> 5. IPhone
> 6. Legacy Android phone
>
> I'm no expert on mobile development, but perhaps WMF could experiment with
> Google's idea on just one channel to start?

AMP would only be appropriate for the mobile web channel from the
list above. Implementing it would be a matter of placing some sort
of translating proxy between MediaWiki and the requesting user agent
that downgrades the HTML produced by MediaWiki to AMP's restricted
HTML dialect. That sort of translation might be possible in
MobileFrontend, but it would likely be accomplished much more easily
using some other tech stack with good support for HTML manipulation,
like a node.js service. It might be an interesting prototype project
for a volunteer to experiment with a frontend app that consumes the
RESTBase-provided Parsoid HTML (e.g.
https://en.wikipedia.org/api/rest_v1/page/html/NOFX) and spits out
AMP-compliant documents.
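
To make that concrete, here is a minimal sketch of what such a proxy
could look like, assuming node.js with a global fetch (or a fetch
polyfill), Express, and the cheerio library for HTML manipulation.
The handful of rewrites shown (dropping scripts, swapping <img> for
<amp-img>, marking the document as AMP) are only illustrative and
nowhere near the full set of AMP validation rules:

// Hypothetical prototype: an AMP-downgrading proxy in front of the
// RESTBase Parsoid HTML endpoint. Illustrative only, not a complete
// implementation of the AMP spec.
import express from "express";
import * as cheerio from "cheerio";

const app = express();
const RESTBASE = "https://en.wikipedia.org/api/rest_v1/page/html";

app.get("/amp/:title", async (req, res) => {
  const upstream = await fetch(
    `${RESTBASE}/${encodeURIComponent(req.params.title)}`
  );
  if (!upstream.ok) {
    res.status(upstream.status).send("upstream error");
    return;
  }
  const $ = cheerio.load(await upstream.text());

  // AMP forbids author-supplied <script> tags.
  $("script").remove();

  // AMP replaces <img> with <amp-img>, which must declare dimensions;
  // the fallback dimensions here are arbitrary placeholders.
  $("img").each((_, el) => {
    const img = $(el);
    const ampImg = $('<amp-img layout="responsive"></amp-img>')
      .attr("src", img.attr("src") ?? "")
      .attr("width", img.attr("width") ?? "640")
      .attr("height", img.attr("height") ?? "360");
    img.replaceWith(ampImg);
  });

  // Mark the document as AMP and pull in the one allowed script:
  // the AMP runtime.
  $("html").attr("amp", "");
  $("head").append(
    '<script async src="https://cdn.ampproject.org/v0.js"></script>'
  );

  res.type("html").send($.html());
});

app.listen(8080);

A real implementation would also need the rest of AMP's required
boilerplate (the canonical <link>, the amp-boilerplate styles) plus
handling for inline styles, media, and tables, but a prototype on
this order would be enough to run pages through the AMP validator
and see how far off we are.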

The only other real option for producing alternate HTML from
MediaWiki would be swapping out the existing layer that translates
an article's wikitext to HTML for a version that speaks AMP instead.
That would be related to https://phabricator.wikimedia.org/T114194.

Bryan
--
Bryan Davis              Wikimedia Foundation    <bd808@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer            Boise, ID USA
irc: bd808                                        v:415.839.6885 x6855

--
Luis Villa
Sr. Director of Community Engagement
Wikimedia Foundation
Working towards a world in which every single human being can freely share in the sum of all knowledge.