Richard Smith: Get with Web 2.0 or become yesterday’s person

Web 2.0—the social web—has the potential to improve global health greatly and to solve complex problems in health science—as it has already done in particle physics. I heard this message at a conference on global health in Geneva last week, but I also heard that the barriers to these potential achievements are social and cultural, not technological. The machines we can fix. It’s the people—particularly old timers (that’s anybody over 40; I’m 56)—that are the problem.

I suspect that most readers of the BMJ couldn’t tell you much about Web 2.0—despite the journal having had several editorials on the subject. The essence of Web 2.0 is that it’s bottom up and participative: it’s created by the many not the few.

Most websites are firmly Web 1.0—a few wise guys attempting to inform the many. The BMJ’s website has its Web 2.0 components (blogs and rapid responses), but mostly it’s potentates pushing stuff that they try to pass off as wisdom.

Web 2.0 can also be understood by listing its components—RSS feeds, blogs, wikis, podcasts, social networking sites, mash up technology, etc—but the only way to really understand Web 2.0, as to understand anything, is to jump in and start using it.

As an evangelist for Web 2.0 attracted to it by its anarchic, democratising, iconoclastic potential, I urge friends to start using Facebook. Most are reluctant. “My identity will be stolen.” “I don’t want to waste time.” “My children will be horrified.” “I don’t want my life to be exposed: it’ll be like being on Big Brother.”

My friends can choose to stay living in the old world, but I think that they are making a big mistake—because the appearance of Web 2.0 seems to me a crucial step in entering the information age, more crucial even than the invention of the world wide web.

The invention of the web was an essential technical development, but it’s the social and cultural change that will have far more impact.

Ten years ago we used to compare the early days of the web with the early days of film—when directors used one camera fixed on actors on a stage: it was theatre filmed.

Later came outside locations, cutaways, rapid scene switches, stuntmen, car chases, animation, and all the techniques we take for granted.

We couldn’t see ten years ago how the web would move by analogy from filmed theatre to films, but now we are beginning to see. We also begin to see how Web 2.0 can help solve the really big problems—like the gross inequalities in the world between rich and poor.

At the conference in Geneva we heard about the Peoples’ Uni, which aims to provide free education for health workers in the poor world.

Many cannot afford the fees of the distance learning courses of Northern universities, and so a group from Edinburgh has started free learning based on using open access material.

The amount of such material is increasing rapidly, but its educational value is increased by adding teaching and assessment based on the material.

That’s what the People’s University is doing. The university is still a bit Web 1.0, confessed Richard Heller, co-ordinator of Peoples’ Uni. It’ll be more Web 2.0 when the students set the curriculum and do the assessment. That will come.

Particle physicists have been among the first to grasp the potential of Web 2.0 to solve previously insoluble scientific problems by linking computers around the world in the Atlas project. E-science is not about technology but about global interaction: it’s a social not a technological development. Biologists are using Web 2.0 capabilities to unravel the human genome, but we’ve been slower in health to exploit the possibilities.

And our tardiness is social. The Global Fund for responding to AIDS, TB, and malaria has tried to use Web 2.0 through My Global Fund at Work, but the response has been slow.

The reasons, said the manager of the project, are generational, educational, cultural, linguistic, and psychological.

Many health workers come from very respectful cultures where the young do not criticize the old. For many people it’s a huge and terrifying step to press a button and potentially allow the whole world to read your words.

Doctors, I fear, are too fond of a top down world—because they are usually at the top. But that top down world is crumbling. Think of Nicolae Ceauşescu’s statue being hauled down and smashed. That’s the old world of Web 1.0. Get with Web 2.0 in a serious way or become yesterday’s person.

Richard Smith is director of the Ovations Chronic Disease Initiative, editor of Cases Journal, and a member of the board of the Public Library of Science. In the past he’s been the editor of the BMJ, chief executive of the BMJ Publishing Group and UnitedHealth Europe, a doctor of sorts, and a television doctor. He loves making soup, trouble, and marmalade.

Competing interest: Richard Smith is the editor of Cases Journal, which claims to be a Web 2.0 journal. He’s also a Facebook fanatic.

  • I agree with you Richard.
    Life is ceaseless change till death.
    If we do not change we will be left behind and the stream flows on!

  • John Sandars

    Last year we surveyed the use of a range of Web 2.0 tools by doctors and students [1]. As expected, students were the highest users but we were surprised by the enthusiasm of many older doctors. This group could appreciate the value of Web 2.0 as a useful educational resource but they readily admitted that they lacked the knowledge and skills to make the most effective use of this new approach to teaching and learning.

    We recommend structured teaching opportunities to develop the new set of skills required for Web 2.0 learning, perhaps as part of CPD programmes. Doctors require a new type of digital literacy, and the essential competencies will include how to access Web 2.0 resources, how to make appropriate choices, and how to ensure good online etiquette.

    John Sandars, Senior Lecturer and Academic lead for e-learning, Medical Education Unit, The University of Leeds, Leeds, UK
    Sara Schroter, Senior Researcher, BMJ Editorial, BMA House, Tavistock Square, London, UK

    [1] Sandars J, Schroter S. Web 2.0 technologies for undergraduate and postgraduate medical education: an online survey. Postgraduate Medical Journal 2007;83:759-762.

  • John Thompson

    Medical uses for Web 2.0? What about an online equivalent of the House television show in which Internet users are given medical cases to solve/diagnose? Instead of just one or a few doctors in one hospital trying to diagnose a condition and prescribe a response, open up the case to the world and let everyone virtually peer over the shoulders of the primary physician so persons worldwide can offer their ideas.

  • A very thought-provoking piece, Richard. I personally think it is clear that the loose grouping of technologies known as Web 2.0 is going to have a huge long-term impact (although I hate the name ‘Web 2.0’). Yet at this point I would assert that it is too early to tell what the real benefits will be. Wikipedia is undeniably an incredible ‘citizen created’ resource, yet the ongoing soap opera amongst the inner circle who control it is reminiscent of Lord of the Flies. Some believe that Facebook has peaked and will now go the way of Friends Reunited. Google, YouTube, MySpace, and Facebook are all incredibly young organisations. The rapid evolutions that brought them to the fore could just as easily see them disappear.

    To that end, I would like to draw your attention to the concept of the Hype Cycle. Ironically, while you are talking Web 2.0 up as a ‘must have’, my job in the British Computer Society brings me into contact with many IT people (security people in particular) who are in the doldrums of the ‘trough of disillusionment’. That does not mean the technologies aren’t useful, but that we’ve still got some thrashing about before things settle down…then on to The Next Big Thing.

    None of that undermines your assertion that the medical community needs to embrace this new technology – and I think the idea of sharing high-quality knowledge with the poorer world has enormous potential. What I am suggesting is that an enthusiastic embrace needs to be tempered with a scepticism that tests for real results and a stoicism to keep driving onward. I was in my late teens when the web was invented, and I’m in my early thirties now. On all fronts it is too early to tell…

  • Richard Bartley

    I think the term Web 2.0 is misleading. It is not a technical upgrade of the web, but simply a new way of using what already exists. The inventor of the WWW, Tim Berners-Lee, has questioned whether one can use the term in a meaningful way, since many of the technology components of “Web 2.0” have existed since the early days of the Web.

    I tried Facebook and it left me unimpressed and slightly bewildered. Perhaps I am the wrong demographic.

    The best work related social-networking for me is the buffet at courses/conferences, when I can exchange opinions with other attendees.

  • I hope you don’t mind, Richard, but I was inspired to take up the topic on my BCS blog.

  • Nnamdi Udezue

    This was a very interesting article Richard.

    Human communication has always been about people and social capital. The value will come from tools that boost social capital and enhance serendipity between health professionals and enable us to approach health problems with a network effect multiplier.

    Web 2.0 is a shift from the core (pushed information and passive consumers) to the edge (people connecting to one another), on a scale that would be impossible to achieve any other way. The web 2.0 term may be part of short term hype, as noted by David Evans, but I believe it will have long lasting effects.

    The most important word on the web may well be share. We need to be willing to experiment to make the most of the world that we are in, and to be willing to broaden our perspectives, and look at the world as what it could be, rather than what it is.

  • To clarify slightly, my reference to the ‘hype cycle’ does not imply that I believe Web 2.0 to be useless – far from it. Instead, it describes a seemingly inevitable cycle of expectations as new technology is introduced. The lesson is not that new technology should be ignored, but that it will take a while to establish the real long-term benefit (which may not be what was originally forecast), with ups and downs on the way. The real visionary needs to hang on in there, and never lose sight of the real impact.

  • Mary E Black

    In parallel with Fiona Godlee’s blog asking “where are all the women?”… we have a blog on Web 2.0 technology with only male discussants (I think).

    I have several professional hats right now (as well as doctor, writer and international health consultant) and one is part owner of a web consulting technology company in Serbia. Our workforce is 50% female, a record for most companies in the sector. They are there because I actively recruit them. The mixture creates a different corporate culture.

    Richard is right – Web 2.0 is unavoidable. Women are huge users of these services, but unfortunately the pattern we see in medicine is repeated here – a minority of female leaders/writers/voices on the subject.

    This is a real loss to the debate.

  • Michael Kesler

    I enjoyed the initial comment and the following blog interaction. I had heard that there will be a “faster” internet service, which I thought was what was being called Web 2, used at first by government agencies. To discover that Web 2 now means a variety of ways of communicating on the Web is an eye opener. I look forward to learning more about this approach.

  • I agree with the article and believe Web 2.0 is changing everything in the way we learn. It is touching those with learning disabilities by offering the same information in various formats, and reaching out to those who have no financial ability to ever attend college. Instant gratification of knowledge is becoming a way of life for upcoming generations, and they latch on to this tool and utilize it to the nth degree. We baby boomers must learn from them so we can keep up, as, according to statistics, we will be living and functioning citizens on earth longer than ever before.

    In the article “How Web 2.0 is Changing Medicine” by Dean Guistini I read the comment “Web 2.0 may be one of the most influential technologies in the history of publishing” – this is truly an understatement. We have more access to more information than at any time in history. My only concern is the accuracy of the information that is put out there. In most blogs, ANYONE can input information and sign their name as M.D., PhD, etc., so I caution those depending on web sources to find and use only reputable websites. Perhaps a universal format will be in place one day, with verified content on one navigation bar and unverified content on a second.

  • Ann

    Richard writes that “the barriers to these potential achievements are social and cultural, not technological. The machines we can fix. It’s the people—particularly old timers (that’s anybody over 40, I’m 56)—that are the problem.” I don’t want to be “part of the problem,” so I’m excited about the opportunities to learn about and use Web 2.0 and possibly even join Facebook!

  • Quite interesting and exciting. I am looking forward to truly “understanding” Web 2.0 by jumping in, although I can relate to the feeling that “it’s a huge and terrifying step to press a button and potentially allow the whole world to read your words.”

  • Sulyman

    You are right to a large extent. Respect for seniors is ingrained not only in medical practice but also in the cultures of most developing countries. However, the only thing that appears constant is change: change in ways of doing and viewing things to achieve better results.
    (publishing within 48 hrs).

    Sulyman Zubair