What is Cybermetrics?

Cybermetrics, or Webometrics, is mainly concerned with measuring aspects of the web: web sites, web pages, parts of web pages, words in web pages, hyperlinks, and web search engines. The importance of the web itself as a communication medium and as a host to an increasingly wide array of documents, from journal articles to holiday brochures, needs no introduction. Given this huge and easily accessible source of information, there are limitless possibilities for measuring or counting things on a very large scale (e.g., the number of web sites, the number of web pages, the number of blogs) or on a smaller scale (e.g., the number of web sites in Ireland, the number of web pages in the CNN web site, the number of blogs mentioning Barack Obama prior to the 2008 presidential campaign).
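
To make the idea of small-scale counting concrete, here is a toy sketch in Python (not part of any tool described below) that fetches a single page and counts how many times a chosen word appears in it; the URL and the word are placeholders.

    # A toy webometric measurement: how often does one word occur in one page?
    import re
    from urllib.request import urlopen


    def word_count(url, word):
        text = urlopen(url).read().decode("utf-8", errors="replace")
        pattern = r"\b" + re.escape(word) + r"\b"
        return len(re.findall(pattern, text, flags=re.IGNORECASE))


    if __name__ == "__main__":
        print(word_count("https://example.com/", "example"))  # placeholder URL and word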

Webometrics is useful for social scientists with research topics that are wholly or partly online (e.g., social networks, news, political communication), and for social scientists with offline research topics that have an online reflection, even if this is not a core component of the research (e.g., diaspora communities, consumer culture, linguistic change).

Webometrics also has many commercial applications, such as evaluating the online impact of marketing campaigns and of reports (e.g., those produced by think tanks and universities), and identifying customer opinions and tracking how they change over time.

About the Statistical Cybermetrics Research Group

The Statistical Cybermetrics Research Group, founded at the end of 2000, specialises in downloading and analysing data from the web on a large scale. We have participated in a number of research projects, mainly funded by the EU, and also undertake private contracts to evaluate web sites and to informally evaluate the publications of research or media-based organisations. We also produce a number of programs, databases and other resources, most of which are made freely available to researchers. Much of this work is reported in our publications.

Some examples of our programs...

LexiURL Searcher is a program that communicates with web and blog search engines in order to submit large numbers of searches automatically. It also has extra capabilities to mass-produce searches and to analyse the results.

[Screenshot: the main LexiURL Searcher screen]
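
To illustrate the general idea only, the rough sketch below submits a list of queries to a generic JSON search API and records each estimated hit count in a CSV file; it is not LexiURL Searcher's actual code, and the endpoint URL, API key and response field names are assumptions standing in for whichever search engine API is used.

    # Batch query submission in the spirit of LexiURL Searcher (a sketch only).
    import csv
    import json
    import time
    from urllib.parse import urlencode
    from urllib.request import urlopen

    API_URL = "https://api.example-search.com/search"  # placeholder endpoint
    API_KEY = "YOUR_KEY_HERE"                          # placeholder credential


    def hit_count(query):
        """Submit one query and return the engine's estimated hit count."""
        url = API_URL + "?" + urlencode({"q": query, "key": API_KEY})
        with urlopen(url) as response:
            data = json.loads(response.read().decode("utf-8"))
        return data.get("estimatedTotal", 0)  # field name is an assumption


    def run_batch(queries, outfile="results.csv", delay=1.0):
        """Submit a list of queries politely and save the hit counts to CSV."""
        with open(outfile, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["query", "hits"])
            for q in queries:
                writer.writerow([q, hit_count(q)])
                time.sleep(delay)  # pause between requests


    if __name__ == "__main__":
        run_batch(["link:example.com", "site:example.com webometrics"])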

LexiURL is a program that processes web link information and produces a set of standard reports.

[Screenshot: the LexiURL list of standard reports]
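
As an illustration of the kind of standard report meant here, the sketch below summarises link data by the top-level domain of the linking page (.uk, .de, .edu and so on); the input format, one source-target pair per line, is an assumption rather than LexiURL's real file format.

    # Count linking pages by the top-level domain of the source URL (a sketch).
    from collections import Counter
    from urllib.parse import urlparse


    def tld_report(link_file):
        counts = Counter()
        with open(link_file) as f:
            for line in f:
                parts = line.split()
                if len(parts) != 2:
                    continue
                source, _target = parts
                host = urlparse(source).netloc
                if "." in host:
                    counts[host.rsplit(".", 1)[-1]] += 1
        for tld, n in counts.most_common():
            print(f".{tld}\t{n}")


    if __name__ == "__main__":
        tld_report("links.txt")  # hypothetical input file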

SocSciBot is a program that crawls web sites, saves all the pages and extracts their hyperlinks. Large sets of university link structure data are also available online.

[Screenshot: the SocSciBot crawling screen]
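
A real crawler also needs robots.txt handling, politeness delays and error recovery, but the stripped-down sketch below shows the basic loop that this style of crawling rests on: fetch a page, keep a copy, extract its links, and follow those that stay within the same site. The start URL is a placeholder and the code is not SocSciBot itself.

    # A minimal single-site crawler (illustrative sketch, not SocSciBot).
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen


    class LinkExtractor(HTMLParser):
        """Collect the href values of <a> tags as a page is parsed."""

        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)


    def crawl(start_url, max_pages=50):
        site = urlparse(start_url).netloc
        seen, queue, pages = set(), [start_url], {}
        while queue and len(pages) < max_pages:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                html = urlopen(url).read().decode("utf-8", errors="replace")
            except OSError:
                continue
            pages[url] = html  # keep a copy of the page
            parser = LinkExtractor()
            parser.feed(html)
            for href in parser.links:
                absolute = urljoin(url, href)
                if urlparse(absolute).netloc == site:  # stay within the site
                    queue.append(absolute)
        return pages


    if __name__ == "__main__":
        downloaded = crawl("https://example.com/")  # placeholder start URL
        print(len(downloaded), "pages fetched")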

SocSciBot Tools, a companion program to SocSciBot, analyses the links in the data that SocSciBot collects.

[Screenshot: SocSciBot Tools, showing some of the available reports]
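
A very small example of this kind of link analysis: given a file of source-target link pairs from a crawl (a hypothetical format, not SocSciBot's own), count how many pages in the crawl link to each page.

    # Rank pages by the number of links pointing to them within a crawl (sketch).
    from collections import Counter


    def inlinks_within_crawl(link_file):
        inlinks = Counter()
        with open(link_file) as f:
            for line in f:
                parts = line.split()
                if len(parts) == 2:
                    _source, target = parts
                    inlinks[target] += 1
        for page, count in inlinks.most_common(20):
            print(f"{count:5d}  {page}")


    if __name__ == "__main__":
        inlinks_within_crawl("crawl_links.txt")  # hypothetical crawler output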

SocSciBot Cyclist, another companion program, analyses the text in the pages downloaded by SocSciBot.

[Screenshot: the SocSciBot Cyclist screen]
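
As a crude stand-in for this sort of processing, the sketch below strips the HTML tags from a folder of saved pages and counts the most frequent words; the folder name is a placeholder and the approach is far simpler than Cyclist's.

    # Word frequencies across a folder of downloaded HTML pages (a rough sketch).
    import os
    import re
    from collections import Counter


    def word_frequencies(folder):
        counts = Counter()
        for name in os.listdir(folder):
            path = os.path.join(folder, name)
            if not os.path.isfile(path):
                continue
            with open(path, encoding="utf-8", errors="replace") as f:
                html = f.read()
            text = re.sub(r"<[^>]+>", " ", html)  # crude tag stripping
            counts.update(re.findall(r"[a-z]+", text.lower()))
        return counts


    if __name__ == "__main__":
        for word, n in word_frequencies("downloaded_pages").most_common(20):
            print(f"{n:6d}  {word}")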

SocSciBot Network is a program that produces network diagrams of SocSciBot link data.

[Screenshot: SocSciBot Network]
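
The sketch below shows the general idea using the third-party networkx and matplotlib packages rather than SocSciBot Network itself: read source-target link pairs (a hypothetical file format) and draw them as a directed graph.

    # Draw a directed link graph from source-target pairs (illustrative sketch).
    import matplotlib.pyplot as plt
    import networkx as nx


    def draw_link_network(link_file, outfile="network.png"):
        graph = nx.DiGraph()
        with open(link_file) as f:
            for line in f:
                parts = line.split()
                if len(parts) == 2:
                    graph.add_edge(parts[0], parts[1])
        nx.draw(graph, with_labels=True, node_size=300, font_size=6)
        plt.savefig(outfile, dpi=200)


    if __name__ == "__main__":
        draw_link_network("crawl_links.txt")  # hypothetical crawler output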

Mozdeh RSS Analyser is a program that downloads blog and other RSS feeds and conducts text-based analyses, for example to identify peaks in discussion of a given broad topic.

[Screenshot: a Mozdeh RSS Analyser time series graph]
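
In the same spirit, though far simpler than Mozdeh, the sketch below fetches a single RSS feed and counts how many items per day mention a chosen keyword, so a discussion peak shows up as an unusually high daily count; the feed URL and keyword are placeholders.

    # Count keyword mentions per day in one RSS feed (illustrative sketch).
    from collections import Counter
    from email.utils import parsedate_to_datetime
    from urllib.request import urlopen
    import xml.etree.ElementTree as ET


    def mentions_per_day(feed_url, keyword):
        counts = Counter()
        root = ET.fromstring(urlopen(feed_url).read())
        for item in root.iter("item"):
            text = (item.findtext("title") or "") + " " + (item.findtext("description") or "")
            date = item.findtext("pubDate")
            if date and keyword.lower() in text.lower():
                counts[parsedate_to_datetime(date).date()] += 1
        return counts


    if __name__ == "__main__":
        feed = "https://example.com/feed.rss"  # placeholder feed URL
        for day, n in sorted(mentions_per_day(feed, "election").items()):
            print(day, n)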

RESCAR is a special-purpose program that helps a human researcher gather specific web data from specific web pages, supported by (manual) Google searches.

[Screenshot: RESCAR with data on the left and Google on the right]