Improving Protocol Standards for a more Trustworthy Internet

Lead Research Organisation: University of Glasgow
Department Name: School of Computing Science


This project will make the Internet's infrastructure and applications more
reliable and secure, more trustworthy and less vulnerable to cyber attack,
by improving the engineering processes by which the network is designed.

The Internet comprises a large number of laptops, smartphones, and other
edge devices, connecting to servers located in data centres around the
world via numerous interconnecting links and switching devices. To make
this work, all the devices must agree on how they should communicate. That
is, they must speak a common language, known as a "protocol", that describes
the format of the information that is sent and the operations to be
performed. There are many such protocols, describing the different types
of communication. For example, the HTTP protocol describes how browsers
fetch pages from websites.

To ensure interoperability between devices from different manufacturers,
these protocols are described in a series of standards documents, published
by organisations such as the Internet Engineering Task Force (IETF). These
standards are developed incrementally by teams of engineers working over
several months, or perhaps years, to produce a written specification that
describes how the protocol should work. Despite the best efforts of those
developing the standards, however, the results are often found to contain
inconsistencies and ambiguities. These can lead to devices from different
manufacturers failing to work together, due to differing interpretations of
the standard, and in the worst cases can lead to vulnerabilities that open
devices up to cyber attack.

Much of the reason for these inconsistencies and ambiguities is that the
protocol standards are written in English, and hence there's no automated
way of checking them for correctness. Researchers have proposed ways of
describing protocols using methods (known as "formal languages") that are
more like computer programming languages, and that would allow automated
consistency checks to be made, but these have not been widely adopted by
the standards community.
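To illustrate the kind of automated check a formal description enables, here is a minimal sketch in Python. The header layout below is purely illustrative (loosely modelled on a UDP-like header) and is not drawn from any real specification; the point is that once field positions are expressed as data rather than prose, inconsistencies that an English text could hide, such as overlapping fields or gaps, can be detected mechanically.

```python
# Illustrative sketch only: a protocol header described as data, so that
# simple consistency checks can be automated. Field names and sizes are
# hypothetical, not taken from any real standard.

from dataclasses import dataclass

@dataclass(frozen=True)
class Field:
    name: str
    offset_bits: int   # position of the field within the header
    width_bits: int    # size of the field

HEADER_SIZE_BITS = 64

FIELDS = [
    Field("source_port", 0, 16),
    Field("dest_port", 16, 16),
    Field("length", 32, 16),
    Field("checksum", 48, 16),
]

def check_consistency(fields, header_size_bits):
    """Return a list of problems a prose specification could hide:
    overlapping fields, gaps between fields, or fields that do not
    exactly fill the declared header size."""
    problems = []
    expected_offset = 0
    for f in sorted(fields, key=lambda f: f.offset_bits):
        if f.offset_bits != expected_offset:
            problems.append(f"gap or overlap before {f.name!r}")
        expected_offset = f.offset_bits + f.width_bits
    if expected_offset != header_size_bits:
        problems.append("fields do not exactly fill the header")
    return problems

print(check_consistency(FIELDS, HEADER_SIZE_BITS))  # → []
```

A real formal language for protocols would express far richer properties (message sequencing, state machines, security invariants), but even this toy check is something no amount of careful English prose can provide automatically.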

This project will study the social, cultural, and educational barriers to
adoption of these new techniques, to understand why standards continue to
be written in English. We will explore the perceived limitations of the
alternatives, to understand why they've been adopted in certain niches,
and for certain purposes, but are not used more broadly in standards
development.

We'll then formulate a model for the adoption of formal languages and their
supporting tools in the protocol standards community, and use it to identify
areas that are ready to make greater use of such techniques in their standards.
Finally, we'll use the knowledge gained to propose formal languages that
are designed to fit the way the standards developers work, and begin the
process of introducing these into the standards process, to improve
protocol specifications and make them less vulnerable to attack.

The work will be conducted in the IETF, since it's the key international
technical standards body developing Internet protocol standards. The aim is
to improve the quality and trustworthiness of the standards that the IETF
develops, and increase security, robustness, and interoperability of the
Internet. The novel engineering research idea we will explore is that
formal languages need to be adapted to the community of interest. It is not
enough that they help solve the technical problem of how to specify a
protocol: they must do so in a way that fits the expertise and culture of
those who need to use them. Research into structured approaches and formal
languages for protocol design has not yet considered the nature of the
standards process, and hence has not seen wide uptake. We start with a deep
awareness of the standards process, consider social and technical barriers
to uptake, and propose new techniques to improve the way standards are
written.

Planned Impact

We will make the Internet's infrastructure and applications more reliable
and secure, hence more trustworthy and less vulnerable to cyber attacks, by
improving the underlying protocol standards. This will benefit network
operators and developers of networked applications, enabling them to build
more interoperable, robust, and secure systems. In turn, this will benefit
cyber society by making the Internet cyber infrastructure more trustworthy.

We will engage with the international protocol standards community, in
particular the Internet Engineering Task Force (IETF), to introduce new
approaches to protocol specification to make it easier to automatically
check the specifications and standards they produce for consistency and
correctness. This will help reduce the number of errors in the standards,
making them more trustworthy and less likely to contain flaws that lead
to security vulnerabilities. It will also increase productivity of the
standards setting organisations, since the engineers working in them will
be able to concentrate more on solving new problems than on maintaining
and correcting existing specifications. The PI has long experience working
with the IETF, developing protocol standards and chairing working groups,
and has the expertise and connections to conduct and support this work.

We will work with the academic researchers in computer networks, formal
methods, and usability communities. We will use approaches from usability
work to introduce concepts, ideas, and techniques from previous research
in how to specify and verify network protocols into the standards world,
and bridge the gap between the standards and research communities. We
expect to provide significant input to researchers studying network protocol
specification and verification, in terms of what works and what doesn't
in real-world standards. Through the new techniques we propose, we will
also be able to extract information that allows analysis of emerging
protocols, which the research community has previously been unable to study
until their specifications were complete. By encouraging such study early,
we hope to foster a feedback cycle between research and standardisation.

We will engage with the academic community by targeted publication of
results at networking conferences such as ACM SIGCOMM, CoNEXT, ANRW,
IEEE INFOCOM, and IEEE ICNP, and in journals such as IEEE Internet
Computing, IEEE/ACM Transactions on Networking, and Computer Networks, but
also at venues such as the ACM SPLASH conference (on the interplay between
languages, types, and protocols) and the IEEE security and privacy
conferences (e.g. the workshop on language-theoretic security).

We'll organise workshops to bring the research and standards communities
together to share ideas and experiences. The PI co-founded the ACM/IRTF
Applied Networking Research Workshop, which is co-located with IETF
meetings and attracts strong attendance from both communities, and which
would be an ideal venue for panels or special sessions on this topic.
Colleagues in the
School of Computing Science at Glasgow have strong track records in use of
formal methods and type theory for communication, and we will work with
them to reach a broader community. We will also work towards setting up an
Internet Research Task Force (IRTF) research group in this area, to further
consolidate interactions between the communities.

To summarise, this work will benefit the broader society by improving the
standards that application developers and network operators build upon, so
allowing them to build more robust and secure systems, supporting a safe
and effective cyber-society.

