You have probably heard about CO2 and the conference being held these days in Copenhagen (1).
You have probably heard about the goal of carbon neutrality at the Wikimania conference in Gdansk in July 2010 (2).
You may want to discuss the basic and perhaps naive wishes I have written down on the strategy wiki about paper consumption (3).
Do we have an idea of the energy consumption related to online access to a Wikipedia article? Some people say that a few minutes' search on a search engine costs as much energy as boiling water for a cup of tea: is that story true in the case of Wikipedia (4)?
How about moving the servers (5) from Florida to a cold country (Alaska, Canada, Finland, Russia), so that they can be used to heat offices or homes? It might not be unrealistic, as one may read such things as "the solution was to provide nearby homes with our waste heat" (6).
(1) http://en.wikipedia.org/wiki/United_Nations_Climate_Change_Conference_2009 (2) http://meta.wikimedia.org/wiki/Wikimania_2010/Bids/Gda%C5%84sk#Environmental... (3) http://strategy.wikimedia.org/wiki/Proposal:Environmental_policy_for_paper_p... (4) http://technology.timesonline.co.uk/tol/news/tech_and_web/article5489134.ece (5) http://meta.wikimedia.org/wiki/Wikimedia_servers (6) http://www.greenercomputing.com/news/2009/12/08/giant-data-center-heat-londo...
On Sat, Dec 12, 2009 at 5:32 PM, Teofilo teofilowiki@gmail.com wrote:
Do we have an idea of the energy consumption related to online access to a Wikipedia article? Some people say that a few minutes' search on a search engine costs as much energy as boiling water for a cup of tea: is that story true in the case of Wikipedia (4)?
My 2 cents: this PHP is cooking more cups of tea than an optimized program written in C.
On Saturday 12 December 2009 17:41:44 jamesmikedupont@googlemail.com wrote:
On Sat, Dec 12, 2009 at 5:32 PM, Teofilo teofilowiki@gmail.com wrote:
Do we have an idea of the energy consumption related to online access to a Wikipedia article? Some people say that a few minutes' search on a search engine costs as much energy as boiling water for a cup of tea: is that story true in the case of Wikipedia (4)?
My 2 cents: this PHP is cooking more cups of tea than an optimized program written in C.
But think of all the coffee developers would have to cook while coding and optimizing in C!
On Sun, Dec 13, 2009 at 10:30 AM, Nikola Smolenski smolensk@eunet.rs wrote:
On Saturday 12 December 2009 17:41:44 jamesmikedupont@googlemail.com wrote:
On Sat, Dec 12, 2009 at 5:32 PM, Teofilo teofilowiki@gmail.com wrote:
Do we have an idea of the energy consumption related to online access to a Wikipedia article? Some people say that a few minutes' search on a search engine costs as much energy as boiling water for a cup of tea: is that story true in the case of Wikipedia (4)?
My 2 cents: this PHP is cooking more cups of tea than an optimized program written in C.
But think of all the coffee developers would have to cook while coding and optimizing in C!
But that is a one-off expense. That is why we programmers can earn a living: we can work on many projects. Also, we drink coffee while playing UrbanTerror.
1. PHP is very hard to optimize.
2. MediaWiki has a pretty nonstandard syntax. The best I have seen is the Python implementation of the wikibook parser. But given that each plugin can change the syntax as it likes, it will only get more complex.
3. Even Python is easier to optimize than PHP.
4. The other question is: does it make sense to have such a centralized client-server architecture? We have been talking about using a distributed VCS for MediaWiki.
5. Well, even if MediaWiki were fully distributed, it would still cost CPU, but that cost would be distributed. Each edit that has to be copied causes work to be done; in a distributed system, even more work in total.
6. Now, I have been wondering anyway who the beneficiary is of all these millions spent on bandwidth: where does that money go, anyway? What about making a Wikipedia network and having the people who want to access it pay, instead of us paying to give it away? With those millions you could buy a lot of routers and cables.
7. Now, back to the optimization. Let's say you were able to optimize the program: we would identify the major CPU burners and optimize them out. That does not solve the problem, because I would think the PHP program is only a small part of the entire issue. The waste comes from the data flowing in a wasteful way, not from the program itself: even if you were much more efficient at moving around data that is not needed, the data would still not be needed.
In an optimal world this would eventually lead to updates not being distributed at all. Not all changes have to be centralized. Say there is one editor who pulls the changes from others and makes a public version: only they would need to have all the data for that one topic. I think you could optimize Wikipedia along the lines of data travelling only to the people who need it (editors versus viewers). You would first optimize a way to route edits into special-interest groups, creating smaller virtual subnetworks of the editors' CPUs working together in a direct peer-to-peer network.
So if you have 10 people collaborating on a topic, only the results of that work are checked into the central server. The decentralized communication happens between fewer parties and reduces the resources used.
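A toy sketch in Python of the saving I mean (all numbers invented, and this counts only edit traffic, not reads):

    editors, edits_each = 10, 50

    central_updates = editors * edits_each  # every intermediate edit hits the hub
    merged_updates = 1                      # only the merged result gets pushed

    print(central_updates, "updates vs", merged_updates)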
see also: http://strategy.wikimedia.org/wiki/Proposal:A_MediaWiki_Parser_in_C
mike
Hi!!!
- PHP is very hard to optimize.
No, PHP is much easier to optimize (read: performance-oriented refactoring).
- Even Python is easier to optimize than PHP.
Python's main design idea is readability. What is readable is easier to refactor too, right? :)
- The other question is: does it make sense to have such a centralized client-server architecture? We have been talking about using a distributed VCS for MediaWiki.
Lunatics without any idea of stuff being done inside the engine talk about distribution. Let them!
- Well, even if MediaWiki were fully distributed, it would still cost CPU, but that cost would be distributed. Each edit that has to be copied causes work to be done; in a distributed system, even more work in total.
Indeed, distribution raises costs.
- Now, I have been wondering anyway who the beneficiary is of all these millions spent on bandwidth: where does that money go, anyway? What about making a Wikipedia network and having the people who want to access it pay, instead of us paying to give it away? With those millions you could buy a lot of routers and cables.
LOL. There's quite some competition in the network department, and it has become an economy of scale (or of serving YouTube) long ago.
- Now, back to the optimization. Let's say you were able to optimize the program: we would identify the major CPU burners and optimize them out. That does not solve the problem, because I would think the PHP program is only a small part of the entire issue. The waste comes from the data flowing in a wasteful way, not from the program itself: even if you were much more efficient at moving around data that is not needed, the data would still not be needed.
We can have a new kind of Wikipedia: one where we serve blank pages, and people imagine the content in them. We've done that with moderate success quite often.
So if you have 10 people collaborating on a topic, only the results of that work are checked into the central server. The decentralized communication happens between fewer parties and reduces the resources used.
Except that you still need a tracker to handle all that and to resolve conflicts; there are still no good methods of resolving conflicts among a small number of untrusted entities.
see also: http://strategy.wikimedia.org/wiki/Proposal:A_MediaWiki_Parser_in_C
How much would that save?
Domas
Let me sum this up. The basic optimization is this: you don't need to transfer every revision of an article to all users at all times. The central server could just say: this is the last revision released by the editors responsible for it; there are 100 edits in process, and you can get involved by going to this page here (hosted on a server someplace else). There is no need to transfer those 100 edits to all users on the web; they are not interesting to everyone.
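The answer could be as small as something like this (all field names and URLs invented, just to illustrate):

    article_status = {
        "title": "Example article",
        "released_revision": 12345,  # last revision released by its editors
        "pending_edits": 100,        # edits in process elsewhere
        "workgroup_url": "http://editors.example.org/Example_article",
    }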
On Sun, Dec 13, 2009 at 12:10 PM, Domas Mituzas midom.lists@gmail.com wrote:
- The other question is: does it make sense to have such a centralized client-server architecture? We have been talking about using a distributed VCS for MediaWiki.
Lunatics without any idea of stuff being done inside the engine talk about distribution. Let them!
I hope you are serious here. Let's take a look at what the engine does: it allows editing of text. It renders the text. It serves the text. The wiki from Ward Cunningham is a Perl script of the most basic form. There is not much magic involved. Of course you need search tools, version histories and such. There are places for optimizing all of those processes.
It is not lunacy; it is a fact that such work can be done, and is done, without a central server in many places.
Just look, for example, at how people edit code in an open-source software project using git. It is distributed, and it works.
There are already wikis based on git available. There are other peer-to-peer networks, such as Tor or Freenet, that could be used.
If you were to split up the editing of Wikipedia articles into a network of git servers across the globe, the rendering and distribution of the resulting data would be the job of the WMF.
Now, the issue of resolving conflicts is pretty simple in the case of git: everyone has a copy and can do what they want with it. If you like the version from someone else, you pull it.
In terms of Wikipedia having only one viewpoint, the NPOV reflected by the current revision at any one point in time, that version would be the one pushed from its editors' repositories. It is imaginable that you would have one senior editor for each topic, with their own repository of pages, who pulls in versions from many people.
- Now, back to the optimization. Let's say you were able to optimize the program: we would identify the major CPU burners and optimize them out. That does not solve the problem, because I would think the PHP program is only a small part of the entire issue. The waste comes from the data flowing in a wasteful way, not from the program itself: even if you were much more efficient at moving around data that is not needed, the data would still not be needed.
We can have a new kind of Wikipedia: one where we serve blank pages, and people imagine the content in them. We've done that with moderate success quite often.
Please, let's be serious here! I am talking about the fact that not all people need all the centralised services at all times.
So if you have 10 people collaborating on a topic, only the results of that work are checked into the central server. The decentralized communication happens between fewer parties and reduces the resources used.
Except that you still need a tracker to handle all that and to resolve conflicts; there are still no good methods of resolving conflicts among a small number of untrusted entities.
A tracker that manages which server is used by which group of editors can be pretty efficient. Essentially it is a form of DNS: a tracker need only show you the current repositories registered for a certain topic.
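A minimal sketch of such a tracker in Python (topic names and repository URLs invented):

    tracker = {}  # topic -> set of registered repository URLs

    def register(topic, repo_url):
        # an editor group announces the repository it works in
        tracker.setdefault(topic, set()).add(repo_url)

    def lookup(topic):
        # a client asks where editing for a topic currently happens
        return tracker.get(topic, set())

    register("Climate change", "git://eu.example.org/climate.git")
    register("Climate change", "git://us.example.org/climate.git")
    print(lookup("Climate change"))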
Resolving conflicts is important, but you only need so many people for that.
The entire community does not get involved in all the conflicts. There are only a certain number of people deeply involved in any one section of Wikipedia at any given time.
Imagine that you had, let's say, 1000 conference rooms available for discussion and working together, spread around the world, and the results of those rooms were fed back into Wikipedia. These rooms, or servers, would process the edits and conflicts for any given set of pages.
My idea is that you don't need a huge server to resolve conflicts. Many pages don't have many conflicts; certain areas need constant arbitration, of course. You could even split the groups up by viewpoint, so that the arbitration team only deals with the output of two teams (pro and contra).
Even if you look at the number of editors on a highly contested page, it is not unlimited.
In retrospect you would be able to identify which groups of editors are collaborating (enhancing each other) and which are conflicting (overwriting each other). If you split them into different rooms when they should be collaborating, and reduce the conflicts, then you will win a lot.
Even in Germany, most edits do not show up immediately: they have someone check the commits. That also means that edits which have not yet been committed do not need to go to a single data center.
People interested in getting all the available versions would need to be able to find them. But for that, people would be prepared to wait a bit longer and collect the data from many servers if needed. You should be able to pull just the versions you want, at the depth you want; that selection of versions and depth would be a large optimization in itself.
So there are different ways to reduce the load on a single server and create pockets of processing for different topics. The only really important thing is that people working on the same topic are working on the same server, or have a path of communication.
To sum it up: if conflicts are the major problem in Wikipedia, the major cost in terms of review and coordination, then you should rethink the workflow to push the processing time back to the editor causing the conflict.
Right now revisions are stored in whole, not in part. If you only add in the new information, you need less storage. One big optimization for Wikipedia would be to transfer only the changes across the net, not full revisions.
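A sketch of the idea with plain unified diffs in Python (a real system would use a proper binary delta format):

    import difflib

    revisions = [
        "Wikipedia is a free encyclopedia.",
        "Wikipedia is a free online encyclopedia.",
        "Wikipedia is a free online encyclopedia that anyone can edit.",
    ]

    # keep the first revision in full, each later one as a diff from its parent
    stored = [revisions[0]]
    for prev, curr in zip(revisions, revisions[1:]):
        delta = "\n".join(difflib.unified_diff(
            prev.splitlines(), curr.splitlines(), lineterm=""))
        stored.append(delta)

    full_size = sum(len(r) for r in revisions)
    delta_size = sum(len(s) for s in stored)
    # for long articles with small edits, the delta chain is far smaller
    print(full_size, "bytes as full copies,", delta_size, "bytes as deltas")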
Of course even a new section could be a conflict, if the new text is garbage or in need of editing. If you want to replace a single word or a sentence, that would, let's say, create a conflict branch in one of the external conference rooms, which would host the page until the work there is finished. The main server would just have a pointer to the workgroup, and the load would be pushed away. That also means any local server would be able to process the data and host the branch until it is pushed back to the main server.
OK, I think this is enough for now. I do ask you to remain serious, so that we can have a serious discussion on the topic of optimisation.
thanks, mike
Dude, I need that strong stuff you're having.
Let me sum this up. The basic optimization is this: you don't need to transfer every revision of an article to all users at all times.
There's not much difference between transferring every revision and just some 'good' revisions.
The central server could just say: this is the last revision released by the editors responsible for it; there are 100 edits in process, and you can get involved by going to this page here (hosted on a server someplace else).
Editing is a minuscule part of our workload.
There is no need to transfer those 100 edits to all users on the web; they are not interesting to everyone.
Well, we may not transfer them in the case of flagged revisions; we do transfer them in the case of a pure wiki. The point is, someone has to do the transfer.
Let's take a look at what the engine does: it allows editing of text.
That includes conflict resolution, cross-indexing, history tracking, abuse filtering, full-text indexing, etc.
It renders the text.
It means building the output out of many individual assets (templates, anyone?), embedding media, transforming based on user options, etc.
It serves the text.
And not only text - it serves complex aggregate views like 'last related changes', 'watchlist', 'contributions by new users', etc.
The wiki from Ward Cunningham is a Perl script of the most basic form.
That is probably one of the reasons why we're not using the wiki from Ward Cunningham anymore, and have something else, called MediaWiki.
There is not much magic involved.
Not much use at a multi-million-article wiki with hundreds of millions of revisions.
Of course you need search tools, version histories and such. There are places for optimizing all of those processes.
And we've done that with MediaWiki ;-)
It is not lunacy; it is a fact that such work can be done, and is done, without a central server in many places.
Name me a single website with a distributed-over-the-internet backend.
Just look, for example, at how people edit code in an open-source software project using git. It is distributed, and it works.
Git is limited and expensive for way too many of our operations. Also, you have to have a whole copy of the repository; Git doesn't have on-demand remote pulls, nor any caching layer attached to that. I appreciate your will to clone Wikipedia.
It works if you want expensive accesses, of course. We're talking about serving a website here, not the case so nicely depicted at: http://xkcd.com/303/
There are already wikis based on git available.
Has anyone tried putting Wikipedia's content on them and simulating our workload? :) I understand that Git's semantics are usable for Wikipedia's basic revision storage, but its data would still have to be replicated to other types of storage that allow various cross-indexing and cross-reporting.
How well does Git handle parallelism internally? How can it be parallelized over multiple machines? etc. ;-) It lacks engineering. The basic stuff is nice, but it isn't what we need.
There are other peer-to-peer networks, such as Tor or Freenet, that could be used.
How? These are just transports.
If you were to split up the editing of Wikipedia articles into a network of git servers across the globe, the rendering and distribution of the resulting data would be the job of the WMF.
And how would that save any money? By adding much more complexity to most processes, while leaving the major cost item untouched?
Now, the issue of resolving conflicts is pretty simple in the case of git: everyone has a copy and can do what they want with it. If you like the version from someone else, you pull it.
Whose revision does Wikimedia merge?
In terms of Wikipedia having only one viewpoint, the NPOV reflected by the current revision at any one point in time, that version would be the one pushed from its editors' repositories. It is imaginable that you would have one senior editor for each topic, with their own repository of pages, who pulls in versions from many people.
Go to Citizendium, k, thx.
Please, let's be serious here! I am talking about the fact that not all people need all the centralised services at all times.
You have an absolute misunderstanding of what our technology platform is doing. You're wasting your time, you're wasting my time, and you're wasting the time of everyone who has to read your or my emails.
A tracker that manages which server is used by which group of editors can be pretty efficient. Essentially it is a form of DNS: a tracker need only show you the current repositories registered for a certain topic.
Seriously, need that stuff you're on. Have you ever been involved in building anything remotely similar?
The entire community does not get involved in all the conflicts. There are only a certain number of people deeply involved in any one section of Wikipedia at any given time.
Have you ever edited Wikipedia? :) Do you understand the editorial process there?
Imagine that you had, let's say, 1000 conference rooms available for discussion and working together, spread around the world, and the results of those rooms were fed back into Wikipedia. These rooms, or servers, would process the edits and conflicts for any given set of pages.
How is that more efficient?
My idea is that you don't need a huge server to resolve conflicts. Many pages don't have many conflicts; certain areas need constant arbitration, of course. You could even split the groups up by viewpoint, so that the arbitration team only deals with the output of two teams (pro and contra).
NEED YOUR STUFFFFFF.
In retrospect you would be able to identify which groups of editors are collaborating (enhancing each other) and which are conflicting (overwriting each other). If you split them into different rooms when they should be collaborating, and reduce the conflicts, then you will win a lot.
You'll get Nobel prize of literature if you continue so! Infinite monkeys, when managed properly, ... ;-)
Even in Germany, most edits do not show up immediately: they have someone check the commits. That also means that edits which have not yet been committed do not need to go to a single data center.
Again, you don't win efficiency. You win 'something', like bragging rights in your local p2p-wanking-circle. This part of the editorial process is minuscule in terms of workload.
You should be able to pull just the versions you want, at the depth you want; that selection of versions and depth would be a large optimization in itself.
Except that that is not where the cost is for us.
So there are different ways to reduce the load on a single server and create pockets of processing for different topics. The only really important thing is that people working on the same topic are working on the same server, or have a path of communication.
YOU SHOULD MENTION JABBER!!!111oneoneeleven
To sum it up: if conflicts are the major problem in Wikipedia, the major cost in terms of review and coordination, then you should rethink the workflow to push the processing time back to the editor causing the conflict.
Semi-atomic resolution of conflicts is what allows fast collaboration to happen. You fail to understand that.
Right now revisions are stored in whole, not in part. If you only add in the new information, you need less storage. One big optimization for Wikipedia would be to transfer only the changes across the net, not full revisions.
??????
OK, I think this is enough for now. I do ask you to remain serious, so that we can have a serious discussion on the topic of optimisation.
I am serious. You fail at everything.
You fail to understand online operation implications (privacy, security, etc).
You fail to understand our content.
You fail to understand our costs.
You fail to understand our archival and cross-indexing needs.
You fail to understand our editorial process efficiency.
You fail to understand that distribution increases overall costs.
You fail to understand pretty much everything.
I admire your enthusiasm for 'scaling a basic wiki'. We're not running a basic wiki; we're way beyond that. I have no idea how I can have a serious discussion with someone who is so out of touch with reality. You suggest a high-complexity engineering project that would bring nearly no wins over anything. At this point you should erase your email client; that would be much more efficient.
I deliberately keep this topic on foundation-l, because I'm sure it is not worth the time of people on wikitech-l@ ;-)
Domas
2009/12/12 Teofilo teofilowiki@gmail.com:
How about moving the servers (5) from Florida to a cold country (Alaska, Canada, Finland, Russia), so that they can be used to heat offices or homes? It might not be unrealistic, as one may read such things as "the solution was to provide nearby homes with our waste heat" (6).
Alaska has seriously expensive construction costs, and the others listed have unacceptable legal systems.
On 12/12/2009 08:32 AM, Teofilo wrote:
Do we have an idea of the energy consumption related to online access to a Wikipedia article? Some people say that a few minutes' search on a search engine costs as much energy as boiling water for a cup of tea: is that story true in the case of Wikipedia (4)?
I don't have time to do the math right now, but I believe this could be estimated from publicly available data. You'd take the pageview numbers:
http://stats.wikimedia.org/wikimedia/squids/SquidReportRequests.htm
You'd look up our various servers:
http://wikitech.wikimedia.org/view/Main_Page
And then make some reasonable guesses as to actual power consumption. (Sysadmins often measure this, so I'm sure some Googling would turn up good approximations.) Divide one number by the other and you've got a reasonably good guess at power usage per pageview.
You could take that a step farther by looking up the power composition where the server farms are and estimating CO2 output.
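Roughly, in Python; every number below is a placeholder to be replaced with real figures from the pages above:

    monthly_pageviews = 10e9   # placeholder: read off the squid report
    server_count = 350         # placeholder: count from the wikitech list
    watts_per_server = 250     # placeholder: a common guess for a 1U box

    hours_per_month = 30 * 24
    kwh_per_month = server_count * watts_per_server * hours_per_month / 1000.0
    joules_per_view = kwh_per_month * 3.6e6 / monthly_pageviews
    # boiling water for one cup of tea takes roughly 100,000 J, for comparison
    print(round(kwh_per_month), "kWh/month,", round(joules_per_view, 1), "J per pageview")

    # step two: CO2 from the grid mix where the server farms sit
    kg_co2_per_kwh = 0.6       # placeholder: depends on local power composition
    print(round(kwh_per_month * kg_co2_per_kwh / 1000, 1), "t CO2 per month")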
If anybody tries to do this and gets stuck, drop me a line.
William
The only reason the servers and internet access produce CO2 emissions is because of the defective and antiquated energy production systems we use across the world. As we move towards more efficient and "cleaner" means of energy production, the carbon footprint should decrease.
Moving servers to Scandinavia would be interesting, but an unsound logistical idea. I agree that it would be an effective reuse of energy, but I am concerned about the access problem of relocating assets in one region. Now, placing new servers in Scandinavia on a grid so that the energy production can be reused is not a bad idea, but would be something for the chapters there to look at.
With regards to Florida, if the servers are in an office building, one way to decrease costs might be to reconfigure the environmental systems to use the energy from the servers to heat/cool the building. Wikimedia would then be able to recoup part of the utility bills from surrounding tenants.
However, engineering input would be most beneficial in considering these interesting proposals.
Geoffrey
________________________________ From: Teofilo teofilowiki@gmail.com To: foundation-l@lists.wikimedia.org Sent: Sat, December 12, 2009 8:32:12 AM Subject: [Foundation-l] Wikimedia and Environment
You have probably heard about CO2 and the conference being held these days in Copenhagen (1).
You have probably heard about the goal of carbon neutrality at the Wikimania conference in Gdansk in July 2010 (2).
You may want to discuss the basic and perhaps naive wishes I have written down on the strategy wiki about paper consumption (3).
Do we have an idea of the energy consumption related to online access to a Wikipedia article? Some people say that a few minutes' search on a search engine costs as much energy as boiling water for a cup of tea: is that story true in the case of Wikipedia (4)?
How about moving the servers (5) from Florida to a cold country (Alaska, Canada, Finland, Russia), so that they can be used to heat offices or homes? It might not be unrealistic, as one may read such things as "the solution was to provide nearby homes with our waste heat" (6).
(1) http://en.wikipedia.org/wiki/United_Nations_Climate_Change_Conference_2009 (2) http://meta.wikimedia.org/wiki/Wikimania_2010/Bids/Gda%C5%84sk#Environmental... (3) http://strategy.wikimedia.org/wiki/Proposal:Environmental_policy_for_paper_p... (4) http://technology.timesonline.co.uk/tol/news/tech_and_web/article5489134.ece (5) http://meta.wikimedia.org/wiki/Wikimedia_servers (6) http://www.greenercomputing.com/news/2009/12/08/giant-data-center-heat-londo...
2009/12/12 Geoffrey Plourde geo.plrd@yahoo.com:
The only reason the servers and internet access produce CO2 emissions is because of the defective and antiquated energy production systems we use across the world. As we move towards more efficient and "cleaner" means of energy production, the carbon footprint should decrease. Moving servers to Scandinavia would be interesting, but an unsound logistical idea. I agree that it would be an effective reuse of energy, but I am concerned about the access problem of relocating assets in one region. Now, placing new servers in Scandinavia on a grid so that the energy production can be reused is not a bad idea, but would be something for the chapters there to look at.
Iceland! Geothermal energy!
- d.
On 12.12.2009 22:36, David Gerard wrote:
2009/12/12 Geoffrey Plourde geo.plrd@yahoo.com:
The only reason the servers and internet access produce CO2 emissions is because of the defective and antiquated energy production systems we use across the world. As we move towards more efficient and "cleaner" means of energy production, the carbon footprint should decrease. Moving servers to Scandinavia would be interesting, but an unsound logistical idea. I agree that it would be an effective reuse of energy, but I am concerned about the access problem of relocating assets in one region. Now, placing new servers in Scandinavia on a grid so that the energy production can be reused is not a bad idea, but would be something for the chapters there to look at.
Iceland! Geothermal energy!
but we need to cool our servers, not heat them :)
masti DataCenter Manager :)
2009/12/12, Geoffrey Plourde geo.plrd@yahoo.com:
With regards to Florida, if the servers are in an office building, one way to decrease costs might be to reconfigure the environmental systems to use the energy from the servers to heat/cool the building. Wikimedia would then be able to recoup part of the utility bills from surrounding tenants.
I am not sure the laws of thermodynamics (1) would allow using that heat to cool a building. You would need a cold source, like a river, to convert heat back into electricity. But it might be more cost-efficient to have the water from the river circulate directly through the building, so that your extra heat still goes unused.
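A rough illustration with made-up but plausible temperatures, using the Carnot limit:

    # best possible efficiency of a heat engine between server exhaust and a river
    T_hot = 273.15 + 40    # server exhaust air, about 40 C, in kelvin
    T_cold = 273.15 + 10   # river water, about 10 C, in kelvin
    eta_max = 1 - T_cold / T_hot
    print(round(eta_max * 100, 1), "% at best")  # ~9.6%: low-grade heat makes poor electricity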
This is why I think it is more difficult to find solutions in a hot country like Florida than in a cold country (as long as you don't question the very existence of heated homes in cold countries, leaving aside the possibility of moving people and their homes from cold to warm countries).
(1) http://en.wikipedia.org/wiki/Laws_of_thermodynamics#Second_law
On Sat, Dec 12, 2009 at 11:32 AM, Teofilo teofilowiki@gmail.com wrote:
How about moving the servers (5) from Florida to a cold country (Alaska, Canada, Finland, Russia), so that they can be used to heat offices or homes? It might not be unrealistic, as one may read such things as "the solution was to provide nearby homes with our waste heat" (6).
I imagine the average Wikimedia user is more concerned with whether his requests are optimized to be fast than with whether they're optimized to be environmentally friendly. Or, to add a coat of greenwash, remember that power consumption is going to be greater if you have more latency.
If the WMF had $130 million lying around, I would rather they used it to actually serve their mission.
I think Domas hit the nail on the head in May: http://lists.wikimedia.org/pipermail/foundation-l/2009-May/051656.html
Teofilo wrote:
Do we have an idea of the energy consumption related to online access to a Wikipedia article? Some people say that a few minutes' search on a search engine costs as much energy as boiling water for a cup of tea: is that story true in the case of Wikipedia (4)?
How about moving the servers (5) from Florida to a cold country (Alaska, Canada, Finland, Russia), so that they can be used to heat offices or homes? It might not be unrealistic, as one may read such things as "the solution was to provide nearby homes with our waste heat" (6).
Heh. That brings some old memories right front to center.
I used to be a hang-around member of this hacker collective here in Finland. (Intsu -> The Hole -> Cute - as its designation evolved)
While in The Hole phase, we had a rented basement space, and turned down all the heating in the space, after receiving a donated old mainframe. I think you can guess why.
Yours,
Jussi-Ville Heiskanen
On Sat, Dec 12, 2009 at 5:32 PM, Teofilo teofilowiki@gmail.com wrote:
How about moving the servers (5) from Florida to a cold country (Alaska, Canada, Finland, Russia), so that they can be used to heat offices or homes? It might not be unrealistic, as one may read such things as "the solution was to provide nearby homes with our waste heat" (6).
I don't think that's a practical solution. It's not because they need to be cooled that computers consume so much energy; rather the opposite: they use a lot of energy, and because energy cannot be created or destroyed, that energy has to go out some way, and that way is heat.
2009/12/13, Andre Engels andreengels@gmail.com:
I don't think that's a practical solution. It's not because they need to be cooled that computers consume so much energy; rather the opposite: they use a lot of energy, and because energy cannot be created or destroyed, that energy has to go out some way, and that way is heat.
In cold countries, energy can have two lives: a first life making calculations in a computer, or transforming matter (ore into metal, trees into books), and a second life heating homes.
But the best is to use no energy at all: see the OLPC project in Afghanistan (a computer with pedals, like the sewing machines of our great-great-great-grandmothers) (1)
(1) http://www.olpcnews.com/countries/afghanistan/updates_from_olpc_afghanistan_...
2009/12/13 Teofilo teofilowiki@gmail.com:
But the best is to use no energy at all: see the OLPC project in Afghanistan (a computer with pedals, like the sewing machines of our great-great-great-grandmothers) (1) (1) http://www.olpcnews.com/countries/afghanistan/updates_from_olpc_afghanistan_...
That's the answer! Distributed serving by each volunteer's pedal power!
- d.
On Sun, Dec 13, 2009 at 1:22 PM, David Gerard dgerard@gmail.com wrote:
2009/12/13 Teofilo teofilowiki@gmail.com:
But the best is to use no energy at all: see the OLPC project in Afghanistan (a computer with pedals, like the sewing machines of our great-great-great-grandmothers) (1) (1) http://www.olpcnews.com/countries/afghanistan/updates_from_olpc_afghanistan_...
That's the answer! Distributed serving by each volunteer's pedal power!
And you automatically become an admin after 5MW!
Magnus
Hi!
In cold countries, energy can have two lives: a first life making calculations in a computer, or transforming matter (ore into metal, trees into books), and a second life heating homes.
One needs to build out quite static-energy-output datacenters (e.g. deploy 10 MW at once, and don't grow) for that. Not our business.
But the best is to use no energy at all: see the OLPC project in Afghanistan (a computer with pedals, like the sewing machines of our great-great-great-grandmothers) (1)
Do you realize that, in terms of carbon footprint, that is much, much less efficient? Look at the title of the thread.
Domas
Teofilo wrote:
You have probably heard about CO2 and the conference being held these days in Copenhagen (1).
You have probably heard about the goal of carbon neutrality at the Wikimania conference in Gdansk in July 2010 (2).
You may want to discuss the basic and perhaps naive wishes I have written down on the strategy wiki about paper consumption (3).
Paper production has a net negative impact on atmospheric CO2 concentration if the wood comes from a sustainably managed forest or plantation. As long as people keep their PediaPress books for a long time, or dispose of them in a way that does not produce methane, then I don't see a problem.
Do we have an idea of the energy consumption related to online access to a Wikipedia article? Some people say that a few minutes' search on a search engine costs as much energy as boiling water for a cup of tea: is that story true in the case of Wikipedia (4)?
No, it is not true, which makes what I'm about to suggest somewhat more affordable.
Given the lack of political will to make deep cuts to greenhouse gas emissions, and the pitiful excuses politicians make for inaction; given the present nature of the debate, where special interests fund campaigns aimed at stalling any progress by appealing to the ignorance of the public; given the nature of the Foundation, an organisation which raises its funds and conducts most of its activities in the richest and most polluting country in the world: I think there is an argument for voluntary reduction of emissions by the Foundation.
I don't mean by buying tree-planting or efficiency offsets, of which I am deeply skeptical. I think the best way for Wikimedia to take action on climate change would be by buying renewable energy certificates (RECs). Buying RECs from new wind and solar electricity generators is a robust way to reduce CO2 emissions, with minimal danger of double-counting, forward-selling, outright fraud, etc., problems which plague the offset industry.
If Domas's figure of 100 kW is correct, then buying a matching number of RECs would be a small portion of our hosting budget. If funding is nevertheless a problem, then we could have a restricted donation drive, and thereby get a clear mandate from our reader community.
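The arithmetic, assuming that figure and a placeholder REC price (prices vary widely by market):

    load_kw = 100                               # Domas's figure for our total draw
    mwh_per_year = load_kw * 24 * 365 / 1000.0  # = 876 MWh
    rec_price_usd_per_mwh = 20                  # placeholder
    print(mwh_per_year, "MWh/yr, about $", round(mwh_per_year * rec_price_usd_per_mwh), "per year")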
Our colocation facilities would not need to do anything, such as changing their electricity provider. We would, however, need monitoring of our total electricity usage, so that we would know how many RECs to buy.
I'm not appealing to the PR benefits here, or to the way this action would promote the climate change cause in general. I'm just saying that as an organisation composed of rational, moral people, Wikimedia has as much responsibility to act as does any other organisation or individual.
Ultimately, the US will need to reduce its per-capita emissions by around 90% by 2050 to have any hope of avoiding catastrophe (see e.g. [1]). Nature doesn't have exemptions or loopholes; we can't continue emitting by moving economic activity from corporations to charities.
[1] http://www.garnautreview.org.au/chp9.htm#tab9_3, and see chapter 4.3 for the impacts of the 550 case.
-- Tim Starling
On Mon, Dec 14, 2009 at 1:50 AM, Tim Starling tstarling@wikimedia.org wrote:
I'm not appealing to the PR benefits here, or to the way this action would promote the climate change cause in general. I'm just saying that as an organisation composed of rational, moral people, Wikimedia has as much responsibility to act as does any other organisation or individual.
Even accepting the premise that subsidizing renewable energy is a moral duty, that doesn't mean Wikimedia should fund it, any more than it should be spending its budget on feeding starving children. Wikimedia should not be spending any significant amount of donated money on things that do not directly advance its mission, because people donate to fund its mission, not unrelated causes (however important). It's very different from a private individual or company in this respect -- Wikimedia has a duty to spend its money on the things it's accepting donations for.
(If anyone else wants to spend money on this sort of thing, though, I entirely agree that subsidizing renewable energy makes much more sense than trying to cut power usage. Society is not just going to cut its energy usage by 90% -- the resulting drop in quality of life would probably exceed any caused by global warming. The only way to achieve drastic cuts in CO2 emissions is to stop using fossil fuels for power, and that will only happen when there are economical alternatives. Widespread private subsidization of renewables is a relatively direct and reliable way to help make that happen -- although breakthroughs in fundamental research would obviously be preferable, they're uncertain.)
Aryeh Gregor wrote:
On Mon, Dec 14, 2009 at 1:50 AM, Tim Starling tstarling@wikimedia.org wrote:
I'm not appealing to the PR benefits here, or to the way this action would promote the climate change cause in general. I'm just saying that as an organisation composed of rational, moral people, Wikimedia has as much responsibility to act as does any other organisation or individual.
Even accepting the premise that subsidizing renewable energy is a moral duty, that doesn't mean Wikimedia should fund it, any more than it should be spending its budget on feeding starving children. Wikimedia should not be spending any significant amount of donated money on things that do not directly advance its mission, because people donate to fund its mission, not unrelated causes (however important). It's very different from a private individual or company in this respect -- Wikimedia has a duty to spend its money on the things it's accepting donations for.
While the major program spending that Wikimedia performs should be defined by its mission, I think small spending decisions, relating to day-to-day operations, can be made without recourse to our mission. For instance, the office staff should be able to use recycled paper without there being a Board resolution to put it in the mission statement.
In terms of the ethics, there's a big difference between inaction on an issue, say poverty in Africa, and taking direct action in order to make things worse. Wikimedia is not paying people to take food from children's mouths, but it is paying people to burn coal for electricity. I don't think we can claim to be mere bystanders.
-- Tim Starling
On Mon, Dec 14, 2009 at 5:50 AM, Tim Starling tstarling@wikimedia.org wrote:
Aryeh Gregor wrote:
On Mon, Dec 14, 2009 at 1:50 AM, Tim Starling tstarling@wikimedia.org wrote:
I'm not appealing to the PR benefits here, or to the way this action would promote the climate change cause in general. I'm just saying that as an organisation composed of rational, moral people, Wikimedia has as much responsibility to act as does any other organisation or individual.
Even accepting the premise that subsidizing renewable energy is a moral duty, that doesn't mean Wikimedia should fund it, any more than it should be spending its budget on feeding starving children. Wikimedia should not be spending any significant amount of donated money on things that do not directly advance its mission, because people donate to fund its mission, not unrelated causes (however important). It's very different from a private individual or company in this respect -- Wikimedia has a duty to spend its money on the things it's accepting donations for.
While the major program spending that Wikimedia performs should be defined by its mission, I think small spending decisions, relating to day-to-day operations, can be made without recourse to our mission. For instance, the office staff should be able to use recycled paper without there being a Board resolution to put it in the mission statement.
In terms of the ethics, there's a big difference between inaction on an issue, say poverty in Africa, and taking direct action in order to make things worse. Wikimedia is not paying people to take food from children's mouths, but it is paying people to burn coal for electricity. I don't think we can claim to be mere bystanders.
I agree with both of you. Funding renewables isn't really a small thing, and so doesn't seem discretionary. At the same time, Wikimedia isn't a bystander, and it does contribute to the problem.
We are a charity distributing a free public good to the world. I don't think it is out of whack with that to want to also act as responsible citizens. So perhaps something like this actually should be in the mission. Would it be crazy to have a board resolution that said, in essence, "Wikimedia should take reasonable and cost-effective steps to reduce or offset its carbon footprint and other impacts on the environment"? Assuming the Board and the executive director can share a similar idea of what is "reasonable" (a few percent of the budget perhaps?), then taking a position like that actually feels like a responsible thing for a thoughtful charity to do.
-Robert Rohde
I personally support any initiative that would reduce energy consumption. I wonder though (in the pure sense of the term, i.e. I have no idea) if the biggest consumption of energy for the Wikimedia Foundation isn't actually travel. Cars consume huge amounts of fossil fuels, and don't get me started on airplanes (I do seem to recall reading somewhere though that the next Wikimania aims to have near zero impact, which is a Good Thing™).
Александр Дмитрий Alexandr Dmitri
On 12/14/2009 05:50 AM, Tim Starling wrote:
In terms of the ethics, there's a big difference between inaction on an issue, say poverty in Africa, and taking direct action in order to make things worse. Wikimedia is not paying people to take food from children's mouths, but it is paying people to burn coal for electricity. I don't think we can claim to be mere bystanders.
I think that's the key distinction here. Our mission is to make the world better in a pretty specific way, and we should stick to that. However, that's not a license to make the world worse in other ways.
For example, when we get rid of old servers, we can't just dump the toxic components in the nearest river, even if that's cheaper. We have to dispose of them responsibly, even if polluting is nominally better for our mission. The same principle would seem to apply to the CO2 we currently emit. The tricky part is the extent to which it's practical for us alone to take action, as opposed to waiting for society to catch up.
Assuming Domas's number (which seems ballpark correct) and the numbers in our article on green tags, we'd be looking at an expense of circa $20k/yr. That's real money, but at 4% of our hosting budget, it doesn't seem crazy. There are definitely a lot of thorny questions about the quality of the tags, so good ones could be more, but perhaps not much more.
If we get interested in this, I know an expert in the field, and I'm glad to put someone at the foundation in touch.
William
On Mon, Dec 14, 2009 at 8:50 AM, Tim Starling tstarling@wikimedia.org wrote:
While the major program spending that Wikimedia performs should be defined by its mission, I think small spending decisions, relating to day-to-day operations, can be made without recourse to our mission. For instance, the office staff should be able to use recycled paper without there being a Board resolution to put it in the mission statement.
If the sums we're discussing are so small that they can be reasonably compared to the difference between using recycled and regular paper, I don't think they're worth spending much time or effort on either way. How much money would we be talking about to offset Wikimania alone?
In terms of the ethics, there's a big difference between inaction on an issue, say poverty in Africa, and taking direct action in order to make things worse. Wikimedia is not paying people to take food from children's mouths, but it is paying people to burn coal for electricity. I don't think we can claim to be mere bystanders.
By that logic, Wikimedia is actively supporting war (or whatever other government policy you dislike) by withholding income tax from its employees' paychecks to give to the US government. Sure, it has no real choice about paying taxes; but it has no real choice about using electricity, either. If using electricity makes you personally responsible for funding renewable energy, why doesn't paying taxes make you personally responsible for funding antiwar organizations?
Of course, paying taxes funds war in a very direct way. The money goes to the feds and then straight to the military, where a large fraction is immediately spent on guns and bombs, which are possibly used to kill people within a year or two. In contrast, by emitting carbon dioxide, you're contributing to an effect that won't be a big deal for at least a few more decades. And that will probably become no big deal again a few decades after that when everyone's adapted to it. And that won't directly kill anyone in any event, mainly just cause economic harm. And that might not happen anyway if some clever soul comes up with a good enough fossil fuel replacement at any point in the next thirty years. Or if it becomes economical to pump greenhouse gases out of the atmosphere. Or if some cheap scheme is devised to reduce warming some other way, like releasing particles to block sunlight. Or if some unforeseen negative feedback causes warming to not get too bad after all. And of course maybe we've already hit a critical threshold and cutting emissions is pointless by now.
Plus you can add the fact that Wikimedia's contribution to the affair isn't likely to be even measurable, especially if the major damage is from catastrophic changes (e.g., ice caps melting) rather than incremental ones. How much money do you owe for increasing mean global temperature by a billionth of a degree fifty years from now?
All in all, I'd say Wikimedia has a lot more culpability for people being shot.
Aryeh Gregor wrote:
In contrast, by emitting carbon dioxide, you're contributing to an effect that won't be a big deal for at least a few more decades.
It's a big deal already, and by the time it becomes an even bigger deal, it will be too late to act. The global climate takes decades to respond to changes in forcing factors. Even if we stopped all greenhouse gas emissions now, the earth would continue to warm for decades because the heat capacity of the ocean slows down the lower atmosphere's response to increased radiation.
And that will probably become no big deal again a few decades after that when everyone's adapted to it.
Increased temperatures will cause a drop in rainfall and thus a reduction in food generating capacity in Australia, the Mediterranean, Mexico, and north-west and south-west Africa. High temperatures also damage crops directly. In the no-mitigation case, the Garnaut Review (which I've recently been reading and linked to earlier) projects a loss of half of Australia's agricultural capacity by around 2050.
Also in Australia, species will be lost as cooler mountain habitats disappear from the continent, the Great Barrier Reef will be destroyed, and significant freshwater coastal wetlands will be inundated by the sea.
And that won't directly kill anyone in any event, mainly just cause economic harm.
The World Health Organisation disagrees:
http://www.who.int/mediacentre/factsheets/fs266/en/ http://whqlibdoc.who.int/publications/2007/9789241595674_eng.pdf
You just sound gullible when you recycle such claims without showing any awareness of the opposing viewpoint.
And that might not happen anyway if some clever soul comes up with a good enough fossil fuel replacement at any point in the next thirty years.
Like what? Nuclear fusion? Talk about pie in the sky.
Or if it becomes economical to pump greenhouse gases out of the atmosphere.
The Garnaut Review suggests that it may well become economical in a few decades, but only because mandatory targets will raise the price of carbon to several times its current value. This will happen when cheaper measures, like shutting down fossil fuel power stations, are exhausted. Economical doesn't mean cheap.
Or if some cheap scheme is devised to reduce warming some other way, like releasing particles to block sunlight.
And cause famine due to a reduction in tropical rainfall?
Or if some unforeseen negative feedback causes warming to not get too bad after all.
The other side of that probability distribution, of course, is that positive feedback will cause it to be even worse than the high-end IPCC predictions and that the sea level will rise by tens of metres. There are studies on which of these two outcomes is more likely. Some of us do not want to roll the dice.
And of course maybe we've already hit a critical threshold and cutting emissions is pointless by now.
There isn't such a threshold. The more you emit, the hotter it gets. As the temperature rises, the outcomes for both humans and for biodiversity become steadily worse.
Plus you can add the fact that Wikimedia's contribution to the affair isn't likely to be even measurable, especially if the major damage is from catastrophic changes (e.g., ice caps melting) rather than incremental ones. How much money do you owe for increasing mean global temperature by a billionth of a degree fifty years from now?
The cost per capita can be derived from the total cost using a complex mathematical process known as "division". Maybe you've heard of it?
-- Tim Starling
2009/12/15 Tim Starling tstarling@wikimedia.org:
Aryeh Gregor wrote:
In contrast, by emitting carbon dioxide, you're contributing to an effect that won't be a big deal for at least a few more decades.
It's a big deal already, and by the time it becomes an even bigger deal, it will be too late to act. The global climate takes decades to respond to changes in forcing factors. Even if we stopped all greenhouse gas emissions now, the earth would continue to warm for decades because the heat capacity of the ocean slows down the lower atmosphere's response to increased radiation.
I commend to you all John Birmingham's take:
http://www.theage.com.au/opinion/blogs/blunt-instrument/the-sky-is-falling--...
(huh, so Blunt Instrument runs nationally? Good stuff!)
- d.
On Mon, Dec 14, 2009 at 8:13 PM, Tim Starling tstarling@wikimedia.org wrote:
It's a big deal already, and by the time it becomes an even bigger deal, it will be too late to act. The global climate takes decades to respond to changes in forcing factors. Even if we stopped all greenhouse gas emissions now, the earth would continue to warm for decades because the heat capacity of the ocean slows down the lower atmosphere's response to increased radiation.
Then we agree that cutting greenhouse gases is not a very effective solution?
The World Health Organisation disagrees:
http://www.who.int/mediacentre/factsheets/fs266/en/ http://whqlibdoc.who.int/publications/2007/9789241595674_eng.pdf
I said "directly". Militaries kill people directly. Global warming kills people indirectly.
You just sound gullible when you recycle such claims without showing any awareness of the opposing viewpoint.
I don't think I'm recycling claims. I have a fairly unusual view on global warming, actually.
Like what? Nuclear fusion? Talk about pie in the sky.
Or just more effective photovoltaic cells. Or, well, anything other than fossil fuels. Solar and wind power, for instance, are much more viable now than they were thirty years ago. Wikipedia says global photovoltaic power production was 500 kW in 1977. It's not a stretch to suppose that they or other energy sources will be much more viable thirty years from now. In fact, it would be very surprising if we didn't have much better alternatives to fossil fuels by then than we have now.
And cause famine due to a reduction in tropical rainfall?
Sure, maybe. Maybe not. Everything has costs and benefits. Blocking sunlight is a scheme that can be deployed very quickly and cheaply, and could not just completely stop future warming, but reverse warming that's already occurred before deployment. Cutting CO2 is immensely more expensive, slower, and less effective. You were just telling me how cutting carbon will never stop warming, and many people will die to famine if warming doesn't stop. Doesn't that imply people will die of famine either way? The costs need to be weighed against the benefits.
Of course, the experts at large-scale cost-benefit analysis are economists, not climatologists. One panel of economists that set out to systematically examine the issue based on data provided by climatologists is the Copenhagen Consensus:
http://en.wikipedia.org/wiki/Copenhagen_Consensus http://fixtheclimate.com/
The Copenhagen Consensus' Climate Change Project asked a panel of five economists (three of them Nobel laureates) to consider the costs and benefits of various schemes to mitigate or prevent global warming. They took climatologists' predictions for granted, and all agreed that anthropogenic global warming is occurring. The number one solution was to reflect more sunlight (by cloud whitening). Seven of the fifteen schemes involved carbon-cutting; they placed at positions nine through fifteen.
The Copenhagen Consensus was and is controversial, of course. But the issue is far from open-and-shut. Even if cutting GHG emission is part of the solution, it's not at all clear that it makes sense to spend money on it now, rather than invest in alternative energy so we can make larger-scale cuts later.
Are you aware of any groups of experts that have done a systematic cost-benefit analysis on the various options, and reached opposite conclusions to the Copenhagen Consensus? "Experts" here means, say, economists, not climatologists. (And preferably not political appointees either.) Climatologists are experts at predicting climate outcomes, not evaluating the quality-of-life effects of those outcomes. They have no expertise in that. Economics is the discipline concerned with welfare assessment.
By the way, you didn't actually address the point of my last post. If involuntarily releasing greenhouse gases creates a moral obligation to undo the harm caused by that, why doesn't involuntarily paying taxes create the same moral obligation? This is independent of whether cutting GHGs is actually effective (which isn't something I meant to get into, but oh well).
Might I suggest that we're getting a bit off-track here with these broad debates on climate change issues?
I think if we're considering spending $20k/yr on environmental initiatives, then the most effective way for us and the path most in line with Wikimedia's core mission would be to spend that money directly on special efforts to increase high-quality free content about environmental topics on Wikipedia and the other projects.
Thanks, Pharos
Aryeh Gregor wrote:
I said "directly". Militaries kill people directly. Global warming kills people indirectly.
I'll take my reply offlist. I have a blog post at tstarling.com where I've been canvassing this issue; I think that would be a better home for this debate than private email, since other people will be able to read it and comment.
-- Tim Starling
I strongly encourage those who are interested in this to create a proposal for strategic planning consideration: http://strategy.wikimedia.org .
The strategic planning initiative is thinking about the WMF's next five years... this type of conversation is very welcome there.
---------------------------- Philippe Beaudette philippe@wikimedia.org
On Dec 14, 2009, at 12:50 AM, Tim Starling tstarling@wikimedia.org wrote:
Teofilo wrote:
You have probably heard about CO2 and the conference being held these days in Copenhagen (1).
You have probably heard about the goal of carbon neutrality at the Wikimania conference in Gdansk in July 2010 (2).
You may want to discuss the basic and perhaps naive wishes I have written down on the strategy wiki about paper consumption (3).
Paper production has a net negative impact on atmospheric CO2 concentration if the wood comes from a sustainably managed forest or plantation. As long as people keep their PediaPress books for a long time, or dispose of them in a way that does not produce methane, then I don't see a problem.
Do we have an idea of the energy consumption related to the online access to a Wikipedia article ? Some people say that a few minutes long search on a search engine costs as much energy as boiling water for a cup of tea : is that story true in the case of Wikipedia (4) ?
No, it is not true, which makes what I'm about to suggest somewhat more affordable.
Given the lack of political will to make deep cuts to greenhouse gas emissions, and the pitiful excuses politicians make for inaction; given the present nature of the debate, where special interests fund campaigns aimed at stalling any progress by appealing to the ignorance of the public; given the nature of the Foundation, an organisation which raises its funds and conducts most of its activities in the richest and most polluting country in the world: I think there is an argument for voluntary reduction of emissions by the Foundation.
I don't mean by buying tree-planting or efficiency offsets, of which I am deeply skeptical. I think the best way for Wikimedia to take action on climate change would be by buying renewable energy certificates (RECs). Buying RECs from new wind and solar electricity generators is a robust way to reduce CO2 emissions, with minimal danger of double-counting, forward-selling, outright fraud, etc., problems which plague the offset industry.
If Domas's figure of 100 kW is correct, then buying a matching number of RECs would be a small portion of our hosting budget. If funding is nevertheless a problem, then we could have a restricted donation drive, and thereby get a clear mandate from our reader community.
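(The arithmetic behind "a small portion of our hosting budget", with the REC price as an assumed illustrative range; only the 100 kW figure comes from the thread:

power_kw = 100                               # Domas's figure, quoted above
mwh_per_year = power_kw * 24 * 365 / 1000    # 876 MWh of electricity per year

for price in (5, 10, 20):                    # assumed REC prices, USD per MWh
    print(f"at ${price}/MWh: ${mwh_per_year * price:,.0f} per year")
# roughly $4,000 to $18,000 per year at these assumed prices

That range is consistent with the $20k/yr figure mentioned elsewhere in the thread.)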
Our colocation facilities would not need to do anything, such as changing their electricity provider. We would, however, need monitoring of our total electricity usage, so that we would know how many RECs to buy.
I'm not appealing to the PR benefits here, or to the way this action would promote the climate change cause in general. I'm just saying that as an organisation composed of rational, moral people, Wikimedia has as much responsibility to act as does any other organisation or individual.
Ultimately, the US will need to reduce its per-capita emissions by around 90% by 2050 to have any hope of avoiding catastrophe (see e.g. [1]). Nature doesn't have exemptions or loopholes; we can't keep emitting simply by moving economic activity from corporations to charities.
[1] http://www.garnautreview.org.au/chp9.htm#tab9_3, and see chapter 4.3 for the impacts of the 550 ppm case.
-- Tim Starling
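(Expressed as a sustained annual rate, the 90% cut works out as follows; taking 2009 as the baseline year is an assumption for the sketch, not something from the Garnaut review:

remaining_fraction = 0.10     # a 90% per-capita cut, from the post above
years = 2050 - 2009           # baseline year of 2009 is assumed

annual_decline = 1 - remaining_fraction ** (1 / years)
print(f"required decline: {annual_decline:.1%} per year for {years} years")
# about 5.5% per year, every year, for four decades

A compounding 5.5% cut every single year for four decades puts the scale of the task in perspective.)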
On Sat, Dec 12, 2009 at 5:32 PM, Teofilo teofilowiki@gmail.com wrote:
How about moving the servers (5) from Florida to a cold country (Alaska, Canada, Finland, Russia) so that they can be used to heat offices or homes ? It might not be unrealistic as one may read such things as "the solution was to provide nearby homes with our waste heat" (6).
Or Switzerland: not only is it a cold country, but Switzerland already has a scheme for buying "green energy" at a small additional cost.
Under this scheme the power supplier guarantees that the energy is produced with zero CO2 emissions (e.g. hydroelectric power).
In my own case (I am an IT manager), I have fitted my data centre with a "free cooling" air-conditioning system: whenever the external temperature is below 17 °C, the system is fed with outside air instead of mechanically chilled air, at almost no energy cost.
This has reduced my energy costs by 40% (my location in Switzerland is below 17 °C at least 50% of the time, because the nights are cool). The saving could be 50%, but I spend the other 10% on buying "green energy".
In any case the total saving comes to more than 50%, because the waste heat is ducted into the offices (in autumn and winter only) and maintenance of the air-conditioning plant is drastically reduced, with fewer breakdowns.
A free-cooling system is a big cost up front, but after two or three years it has already paid for itself in saved energy.
Ilario
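(A rough payback model of the free-cooling setup described above; the capital cost and baseline cooling bill are hypothetical placeholders, while the 40% saving and the two-to-three-year payback are the figures reported in the post:

capital_cost = 50_000.0         # one-off installation cost: assumed placeholder
annual_cooling_bill = 50_000.0  # pre-retrofit energy bill: assumed placeholder
saving_fraction = 0.40          # the 40% saving reported above

annual_saving = annual_cooling_bill * saving_fraction
print(f"payback in ~{capital_cost / annual_saving:.1f} years")
# 2.5 years with these inputs, in line with the two to three years reported

The general shape holds for any inputs: payback time is simply capital cost divided by annual saving, which is why free cooling pays off quickly wherever nights are cool.)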