Website: neubot.github.io
Status: concluded
Period: June 2008 – June 2017
Funding: about €112,000 in the 2012–2017 period
Funding organization: Regione Piemonte (research grant), Measurement Lab/Google (two distinct unrestricted gifts)
Person(s) in charge: Simone Basso and Antonio Langiu
We are committed to writing open source software and to publishing the data collected by our software as open data. We wrote and still maintain the Neubot software. We actively contribute to measurement-kit, ooniprobe-android, and ooniprobe-ios.
Since the Nexa Center lacks the resources to maintain it, Neubot will be frozen after May 31, 2017. After that date, Simone Basso, the main developer since 2010, will continue to maintain the software tools that originated from this research project.
Background
Neubot started in 2008 to study network neutrality, that is, the principle that the Internet should treat all packets equally, without discrimination based on the protocol, the application, or the source.
The scientific interest in network neutrality was motivated by the desire to understand this key characteristic of the Internet, which, according to many scholars, is a precondition for an open, generative Internet.
The seminal paper by De Martin and Glorioso described the conceptual architecture of Neubot. The first prototype of the Neubot software was written by Gianluigi Pignatari Ardila. Simone Basso joined the Nexa Center in 2010 and continued the development during his doctorate on network measurements, under the supervision of prof. De Martin.
In 2014 the scope of the Neubot project was expanded to also encompass Internet-censorship measurement activities. This happened within the MORFEO subproject, of which the OONI project was also a partner.
Objectives
1. Separate the server-side and client-side components of the Neubot software. We have abandoned the idea of running peer-to-peer tests; therefore, it is more rational to manage the client and the server separately.
2. Sketch out a full plan for delivering the final version of Neubot, since the scientific branch of the project, as run by the Nexa Center, is about to be closed.
3. Develop multi-server network tests. The idea behind this objective is to make network tests more similar to real traffic.
4. Work on data visualization and analysis. This calls for building data visualizations and analyses based on Neubot data.
5. Write a uTP module for measurement-kit. This is a precondition for running uTP network-performance measurements, which are interesting for investigating BitTorrent discrimination (see the sketch after this list).
6. Contribute to the OONI iOS and Android apps. This means adding support on the measurement-kit side and helping with testing and reviewing the apps.
7. Finalize the research paper on Neubot. A comprehensive research paper describing Neubot as a whole is currently in progress.
8. Finalize the research paper that follows up on the NNTools2015 workshop. It shall describe the ideas that led us to implement the measurement-kit library.
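For concreteness, here is a minimal Python sketch, assuming the wire format defined in BitTorrent's BEP 29, of the 20-byte uTP header and of the ST_SYN packet that opens a uTP connection. This is the kind of framing a uTP module would have to implement before any throughput measurement can happen; it is purely illustrative and is not measurement-kit code (the build_syn helper and its defaults are our own).

# Illustrative only: construct the 20-byte uTP (BEP 29) header of an ST_SYN
# packet, i.e. the packet that opens a uTP connection. Not measurement-kit code.
import struct
import time

ST_SYN = 4          # packet type that initiates a uTP connection
UTP_VERSION = 1     # protocol version defined by BEP 29

def build_syn(connection_id, wnd_size=0, seq_nr=1, ack_nr=0):
    """Return the raw bytes of a uTP ST_SYN packet (header only, no extensions)."""
    type_and_version = (ST_SYN << 4) | UTP_VERSION
    timestamp_us = int(time.time() * 1_000_000) & 0xFFFFFFFF
    return struct.pack(
        ">BBHIIIHH",    # type/ver, extension, conn_id, ts, ts_diff, wnd, seq, ack
        type_and_version,
        0,              # extension: 0 means no extension headers
        connection_id,  # receive connection id chosen by the initiator
        timestamp_us,   # sender timestamp in microseconds, truncated to 32 bits
        0,              # timestamp difference: unknown when sending the SYN
        wnd_size,       # advertised receive window, in bytes
        seq_nr,         # uTP sequence numbers start at 1 on the SYN
        ack_nr,
    )

syn = build_syn(connection_id=0x1234)
assert len(syn) == 20

A real test would additionally implement the ST_STATE/ST_DATA exchange and LEDBAT congestion control, and would time the transfer to compute throughput; that is the part that belongs in measurement-kit.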
By the end of May, we will also try to write a plan for finalizing Neubot (objective 2) and to move the papers forward (objectives 7 and 8). It is unlikely that we will be able to pursue any other objective, given the limited amount of time left and the need to wrap up what has been done since 2010.
Related Publications
@conference{11583_2551540,
title = {Measuring DASH Streaming Performance from the End Users Perspective using Neubot},
author = {Simone Basso and Antonio Servetti and Enrico Masala and Juan Carlos De Martin},
url = {https://nexa.polito.it/wp-content/uploads/2024/06/basso2014measuring.pdf},
doi = {10.1145/2557642.2563671},
year = {2014},
date = {2014-03-21},
urldate = {2014-01-01},
booktitle = {Proceedings of the 5th ACM Multimedia Systems Conference},
pages = {1–6},
publisher = {ACM},
address = {New York},
abstract = {The popularity of DASH streaming is rapidly increasing and a number of commercial streaming services are adopting this new standard. While the benefits of building streaming services on top of the HTTP protocol are clear, further work is still necessary to evaluate and enhance the system performance from the perspective of the end user. Here we present a novel framework to evaluate the performance of rate-adaptation algorithms for DASH streaming using network measurements collected from more than a thousand Internet clients. Data, which have been made publicly available, are collected by a DASH module built on top of Neubot, an open source tool for the collection of network measurements. Some examples about the possible usage of the collected data are given, ranging from simple analysis and performance comparisons of download speeds to the performance simulation of alternative adaptation strategies using, e.g., the instantaneous available bandwidth values.},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}
@article{11583_2516488,
title = {The NeuViz Data Visualization Tool for Visualizing Internet-Measurements Data},
author = {Giuseppe Futia and Enrico Zimuel and Simone Basso and Juan Carlos De Martin},
url = {http://mondodigitale.aicanet.net/2014-1/internet_cloud_e_web_domain/05_FUTIA.pdf},
year = {2014},
date = {2014-03-03},
urldate = {2014-01-01},
journal = {Mondo Digitale},
number = {49},
publisher = {AICA - Associazione Italiana per l'Informatica ed il Calcolo Automatico},
abstract = {In this paper we present NeuViz, a data processing and visualization architecture for network measurement experiments. NeuViz has been tailored to work on the data produced by Neubot (Net Neutrality Bot), an Internet bot that performs periodic, active network performance tests. We show that NeuViz is an effective tool to navigate Neubot data to identify cases (to be investigated with more specific network tests) in which a protocol seems discriminated. Also, we suggest how the information provided by the NeuViz Web API can help to automatically detect cases in which a protocol seems discriminated, to raise warnings or trigger more specific tests.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
@conference{11583_2516321,
title = {Visualizing Internet-Measurements Data for Research Purposes: the NeuViz Data Visualization Tool},
author = {Giuseppe Futia and Enrico Zimuel and Simone Basso and Juan Carlos De Martin},
url = {https://nexa.polito.it/wp-content/uploads/2024/06/futia2013visualizing_0.pdf},
year = {2013},
date = {2013-09-18},
urldate = {2013-01-01},
booktitle = {Proceedings del Congresso Nazionale AICA 2013},
abstract = {In this paper we present NeuViz, a data processing and visualization architecture for network measurement experiments. NeuViz has been tailored to work on the data produced by Neubot (Net Neutrality Bot), an Internet bot that performs periodic, active network performance tests. We show that NeuViz is an effective tool to navigate Neubot data to identify cases (to be investigated with more specific network tests) in which a protocol seems discriminated. Also, we suggest how the information provided by the NeuViz Web API can help to automatically detect cases in which a protocol seems discriminated, to raise warnings or trigger more specific tests.},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}
@conference{11583_2516323,
title = {Challenges and Issues on Collecting and Analyzing Large Volumes of Network Data Measurements},
author = {Enrico Masala and Antonio Servetti and Simone Basso and Juan Carlos De Martin},
url = {https://nexa.polito.it/wp-content/uploads/2024/06/masala2014challenges.pdf},
doi = {10.1007/978-3-319-01863-8_23},
year = {2013},
date = {2013-09-01},
urldate = {2013-01-01},
booktitle = {New Trends in Databases and Information Systems},
series = {Advances in Intelligent Systems and Computing},
volume = {241},
pages = {203–212},
publisher = {Springer},
abstract = {This paper presents the main challenges and issues faced when collecting and analyzing a large volume of network data measurements. We refer in particular to data collected by means of Neubot, an open source project that uses active probes on the client side to measure the evolution of key network parameters over time to better understand the performance of end-users’ Internet connections. The measured data are already freely accessible and stored on Measurement Lab (M-Lab), an organization that provides dedicated resources to perform network measurements and diagnostics in the Internet. Given the ever increasing amount of data collected by the Neubot project as well as other similar projects hosted by M-Lab, it is necessary to improve the platform to efficiently handle the huge amount of data that is expected to come in the very near future, so that it can be used by researchers and end-users themselves to gain a better understanding of network behavior.},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}
@article{11583_2516320,
title = {Strengthening measurements from the edges: application-level packet loss rate estimation},
author = {Simone Basso and Michela Meo and Juan Carlos De Martin},
url = {https://nexa.polito.it/wp-content/uploads/2024/06/basso2013strengthening.pdf},
doi = {10.1145/2500098.2500104},
year = {2013},
date = {2013-07-01},
urldate = {2013-01-01},
journal = {Computer Communication Review},
volume = {43},
number = {3},
pages = {45–51},
publisher = {ACM New York, NY, USA},
abstract = {Network users know much less than ISPs, Internet exchanges and content providers about what happens inside the network. Consequently users cannot either easily detect network neutrality violations or readily exercise their market power by knowledgeably switching ISPs.
This paper contributes to the ongoing efforts to empower users by proposing two models to estimate – via application-level measurements – a key network indicator, i.e., the packet loss rate (PLR) experienced by FTP-like TCP downloads.
Controlled, testbed, and large-scale experiments show that the Inverse Mathis model is simpler and more consistent across the whole PLR range, but less accurate than the more advanced Likely Rexmit model for landline connections and moderate PLR.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
@inproceedings{basso2012estimating,
title = {Estimating packet loss rate in the access through application-level measurements},
author = {Simone Basso and Michela Meo and Antonio Servetti and Juan Carlos De Martin},
year = {2012},
date = {2012-08-17},
urldate = {2012-01-01},
booktitle = {Proceedings of the 2012 ACM SIGCOMM workshop on Measurements up the stack},
pages = {7–12},
abstract = {End user monitoring of quality of experience is one of the necessary steps to achieve an effective and winning control over network neutrality. The involvement of the end user, however, requires the development of light and user-friendly tools that can be easily run at the application level with limited effort and network resources usage. In this paper, we propose a simple model to estimate packet loss rate perceived by a connection, by round trip time and TCP goodput samples collected at the application level. The model is derived from the well-known Mathis equation, which predicts the bandwidth of a steady-state TCP connection under random losses and delayed ACKs and it is evaluated in a testbed environment under a wide range of different conditions. Experiments are also run on real access networks. We plan to use the model to analyze the results collected by the “network neutrality bot” (Neubot), a research tool that performs application-level network-performance measurements. However, the methodology is easily portable and can be interesting for basically any user application that performs large downloads or uploads and requires to estimate access network quality and its variations.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
@conference{11583_2483986,
title = {The hitchhiker’s guide to the Network Neutrality Bot test methodology},
author = {Simone Basso and Antonio Servetti and Juan Carlos De Martin},
url = {https://nexa.polito.it/wp-content/uploads/2024/06/2011-aica-paper.pdf},
year = {2011},
date = {2011-11-16},
urldate = {2011-01-01},
booktitle = {Congresso AICA 2011},
abstract = {The Neubot project is based on an open-source computer program, the Neubot, that, downloaded and installed by Internet users, performs quality of service measurements and collects data at a central server. The raw results are published on the web under the terms and conditions of the Creative Commons Zero license. This paper is the guide for researchers and individuals that aims to study, build on and analyze Neubot methodology and results. We provide an exhaustive documentation of Neubot’s HTTP test behavior, along with a discussion of the methodology. Besides that, the article shows an analysis of the Turin-area results (in the May-September time interval) and explains the rationale behind the privacy policy, which allows us to publish results as raw data.},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}
@conference{11583_2481390,
title = {The network neutrality bot architecture: A preliminary approach for self-monitoring of Internet access QoS},
author = {Simone Basso and Antonio Servetti and Juan Carlos De Martin},
url = {https://nexa.polito.it/wp-content/uploads/2024/06/2011-iscc-paper.pdf},
doi = {10.1109/ISCC.2011.5983857},
year = {2011},
date = {2011-07-01},
urldate = {2011-01-01},
booktitle = {Proceedings of the 2011 IEEE Symposium on Computers and Communications (ISCC)},
pages = {1131–1136},
publisher = {IEEE},
abstract = {The “network neutrality bot” (Neubot) is an evolving software architecture for distributed Internet access quality and network neutrality measurements. The core of this architecture is an open-source agent that ordinary users may install on their computers to gain a deeper understanding of their Internet connections. The agent periodically monitors the quality of service provided to the user, running background active transmission tests that emulate different application-level protocols. The results are then collected on a central server and made publicly available to allow constant monitoring of the state of the Internet by interested parties.
In this article we describe how we enhanced Neubot architecture both to deploy a distributed broadband speed test and to allow the development of plug-in transmission tests. In addition, we start a preliminary discussion on the results we have collected in the first three months after the first public release of the software.},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}
@conference{11583_2381222,
title = {Rationale, design, and implementation of the network neutrality bot},
author = {Simone Basso and Antonio Servetti and Juan Carlos De Martin},
url = {http://nexa.polito.it/sites/nexa.polito.it/files/aica2010-neubot-paper.pdf},
isbn = {9788890540608},
year = {2010},
date = {2010-10-01},
urldate = {2010-01-01},
booktitle = {Congresso Nazionale AICA, L'Aquila},
publisher = {A.I.C.A.},
abstract = {The “Network Neutrality Bot” (Neubot) is a software application that measures, in a distributed way, Internet access quality of service with a specific emphasis on detection of potential network neutrality violations (such as peer-to-peer traffic discrimination). It is based on a lightweight, open-source computer program that can be downloaded and installed by ordinary Internet users. The program performs background tests: the results are sent to a centralized server (or collection of servers), which publishes them, thus rebalancing, at least in part, the current deep information asymmetry between Internet Service Providers and users. The collected data will allow constant monitoring of the state of the Internet, enabling a deeper understanding of such crucial infrastructure, as well as a more reliable basis for discussing network neutrality policies.},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}
@article{glorioso2009accesso,
title = {Accesso ad Internet e contratti di connettività business to consumer di quattordici fornitori italiani},
author = {Andrea Glorioso and Valentin Vitkov},
year = {2009},
date = {2009-01-01},
urldate = {2009-01-01},
journal = {Diritto dell’informazione e dell’informatica},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
@inproceedings{de2008neubot,
title = {The Neubot project: A collaborative approach to measuring internet neutrality},
author = {Juan Carlos De Martin and Andrea Glorioso},
url = {https://nexa.polito.it/wp-content/uploads/2024/06/ISTAS08_NEUBOT.pdf},
year = {2008},
date = {2008-06-26},
urldate = {2008-01-01},
booktitle = {IEEE International Symposium on Technology and Society, Fredericton (Canada)},
pages = {1–4},
organization = {IEEE},
abstract = {The Internet was designed to be neutral with respect to kinds of applications, senders and destinations. Such design choice made very fast packet switching possible, while preserving, at the same time, strong openness towards unforeseen uses of the Internet Protocol. The result has been an extraordinary outburst of innovation, as well as a level-playing field for citizens, associations and companies worldwide. With the advent of “deep packet inspection” technology, however, fine-grained discrimination of Internet flows is now possible, be that for economical or other reasons. Collecting quantitative data on the behavior of telecommunications providers with respect to traffic discrimination thus becomes crucial, particularly at a time when policy changes are widely discussed.
The “Network Neutrality Bot” (Neubot) project is based on a lightweight, open source computer program, the Neubot, that, downloaded and installed by Internet users, performs distributed measurements of the traffic characteristics of segments of the global Internet. The collected data will allow constant monitoring of the actual state of the Internet, enabling both a deeper understanding of such crucial infrastructure and a more reliable basis for discussing network neutrality policies.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
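As a closing technical note, the 2012 and 2013 packet-loss papers listed above estimate the packet loss rate (PLR) from application-level round-trip-time and goodput samples by inverting the well-known Mathis formula, goodput ≈ MSS / (RTT · sqrt(2bp/3)). The following Python sketch only illustrates that inversion under the textbook formula; the function name, the default MSS, and the delayed-ACK factor b are our assumptions, not the exact models evaluated in the papers.

# Rough illustration: invert the Mathis formula to estimate the packet loss
# rate p from steady-state goodput and RTT measured at the application level.
def inverse_mathis_plr(goodput_bytes_per_s, rtt_s, mss_bytes=1460, b=2):
    """Estimate the packet loss rate; b is the number of segments covered
    by each ACK (2 with delayed ACKs enabled)."""
    if goodput_bytes_per_s <= 0 or rtt_s <= 0:
        raise ValueError("goodput and RTT must be positive")
    ratio = mss_bytes / (rtt_s * goodput_bytes_per_s)
    return (3.0 / (2.0 * b)) * ratio * ratio

# Example: a 5 Mbit/s download (625,000 bytes/s) over a 50 ms RTT path
# yields an estimated loss rate of roughly 0.16%.
print(inverse_mathis_plr(goodput_bytes_per_s=625_000, rtt_s=0.050))

Such an estimate assumes a long, FTP-like TCP transfer in steady state; as the 2013 paper notes, this simple inverse-Mathis approach is consistent across the whole PLR range but less accurate than the more advanced Likely Rexmit model for landline connections and moderate PLR.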