Tutorial 1 (Full day) - The Model Oriented Domain Analysis tutorial
Presenter: Jorn Bettin, Sofismo, Switzerland
Model Oriented Domain Analysis is a state-of-the-art method for performing commonality and variability analysis in the context of a software product line or family of applications.
The presenter has used Model Oriented Domain Analysis to design eleven families of complementary domain-specific languages. The approach is based on 15 years of practical experience in using model-driven approaches and domain-specific languages to simplify the design of software products and application families. It combines fundamental principles from product line engineering methods such as FAST and KobrA with the capabilities of modern model-driven tooling for rapidly designing and implementing domain-specific languages.
The tutorial includes an introduction to basic domain analysis concepts; participants are not assumed to be familiar with software product line engineering concepts. It is designed to be highly interactive, with 50% of the time allocated to practical domain analysis. The intention is to enable participants to develop draft domain-specific language designs as part of the tutorial.
Participants interested in validating their designs will be assisted in taking their first steps with Eclipse EMF as a meta-modelling tool. Depending on the audience's level of knowledge, a short demonstration of model-driven software development with domain-specific modelling languages will be included.
The tutorial equips participants with the knowledge necessary to assess the potential of Model Oriented Domain Analysis to increase the degree of automation in software design and development in their own organisation.
Tutorial 3 (Half day) - Tips, tricks, and traps of Empirical Software Engineering
Presenters: Tim Menzies, West Virginia University, USA; Emilia Mendes, The University of Auckland, New Zealand
As software engineering matures, so too does our ability to understand, analyze, and report the value of different software engineering techniques and tools. Based on twenty years of experience in the field, the presenters report the best (and worst) ways to analyze and report empirical SE results. This tutorial will also review some of the best, and worst, reports of empirical results in the literature (names will be named, fingers will be pointed). The aim of the tutorial is not to introduce the theory of empirical software engineering, but to provide real examples and straightforward guidelines that help practitioners, researchers, and students critically assess the results presented in the existing literature, and better plan and report their own studies (e.g. when comparing two different tools or development methodologies).
At the end of this half-day tutorial, attendees will understand how to document the value (or otherwise) of a proposed software engineering technique. Attendees will also know the top ten "do not dos" of empirical SE reporting.
Tutorial 4 (Half day) - Runtime Verification with Monitoring-Oriented Programming
Presenter: Grigore Rosu, Department of Computer Science, University of Illinois at Urbana-Champaign
The objective of this tutorial is to introduce attendees to the effectiveness of runtime verification in software development, and to present the use of the monitoring-oriented programming (MOP) framework for performing runtime verification.