Panels

  1. Research Panel: Data Analytics for Software Engineering
    Wednesday, Nov. 5, 9:00am - CC - Aula Magna

  2. Industry Panel: Static Analysis
    Wednesday, Nov. 5, 2:00pm - CC - Room A

  3. IWPD workshop panel: Program Debugging: Research and Practice
    Monday, Nov. 3, 11:00am - Hotel RC - Normanna Room

  4. WoSoCer workshop panel: From Research to Certification
    Monday, Nov. 3, 5:00pm - Hotel RC - Santa Lucia Room

  5. ASSURE workshop panel: Formalism, Automation, and Tool Support for Assurance Cases
    Wednesday, Nov. 5, 5:00pm - Hotel RC - Catalana Room


Research Panel:

Data Analytics for Software Engineering


Wednesday, Nov. 5, 9:00am
 
Moderators:
Veena Mendiratta and Sunita Chulani

Panelists:

Ram Chillarege

Catello Di Martino

Michael Lyu

Donato Malerba

 

Large amounts of data are generated throughout the software lifecycle: source code, feature specifications, bug reports, test cases, execution traces and logs, as well as failure data from the field. Traditionally, classic statistical methods were used for data analysis and prediction. With the growing use of analytics methods such as machine learning, data mining, and data visualization, much more of this data can now be analyzed in new ways, enabling descriptive, prescriptive, and predictive analysis for software engineering. The goal of this panel session is to provide a forum for discussing current practices and the vision for future work in data analytics for software engineering across various domains.
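As a toy illustration of the descriptive end of this spectrum, the sketch below mines a small, invented list of bug-fix commits to rank files by how often they are touched in fixes; the commit data and file names are made up for the example, not drawn from any panelist's work.

```python
# Minimal sketch of descriptive analytics over software lifecycle data:
# count how often each file appears in bug-fix commits to surface
# "defect hot spots". The commit list below is invented toy data.
from collections import Counter

bugfix_commits = [
    {"id": "c1", "files": ["parser.c", "util.c"]},
    {"id": "c2", "files": ["parser.c"]},
    {"id": "c3", "files": ["net.c", "parser.c"]},
]

# Tally every file mentioned in any bug-fix commit.
hot_spots = Counter(f for c in bugfix_commits for f in c["files"])
print(hot_spots.most_common(1))  # -> [('parser.c', 3)]
```

Real mining tools work on full version-control histories, but the principle is the same: simple frequency counts over lifecycle artifacts already yield actionable views of where defects cluster.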



Industry Panel:

Static Analysis


Wednesday, Nov. 5, 2:00pm - CC - Room A
 
Moderator:
Gabriella Carrozza

Panelists:

Salvatore Scervo, Selex ES

Roberto Giacobazzi, University of Verona

Quentin Ochem, AdaCore

Vladimir Sklyar, RADYI

Vadim Okun, National Institute of Standards and Technology (NIST)

 

It is estimated that each developer injects, on average, one defect every 8 lines of code, and it has been demonstrated by eminent researchers that no software can be proven defect free. The best approach for dealing with failures is early detection: attempting to detect software defects at the earliest possible stage (ideally in the same SDLC phase in which they are injected) in order to prevent them from degenerating into failures and to reduce fixing and maintenance costs. Static analysis is a powerful enabler of early detection: at compile time, it is a means of identifying programming errors that can escape both compilers' detection facilities and functional testing campaigns. For this reason, many respected software giants, such as Microsoft and NASA, have been using this technique extensively, with very good results.
Beyond its traditional application of checking developed code for compliance with a set of imposed coding standards, static analysis can provide much more. Many studies show that there exists a positive correlation between statically detected defects and post-release failures; such defects also seem to be good indicators of application vulnerability.
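To make the idea concrete, here is a minimal sketch of what a static check does: it inspects a program's syntax tree without executing it and flags a defect pattern that would otherwise surface only as a runtime failure. The checker, the defect pattern, and the sample program are all invented for illustration; real static analyzers are vastly more sophisticated.

```python
# Minimal sketch of static analysis: walk a program's AST (without
# running it) and flag a defect pattern -- here, division by a
# literal zero, which a compiler accepts but which fails at runtime.
import ast

def find_zero_divisions(source: str) -> list:
    """Return line numbers where division by the literal 0 occurs."""
    hits = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.BinOp)
                and isinstance(node.op, (ast.Div, ast.FloorDiv, ast.Mod))
                and isinstance(node.right, ast.Constant)
                and node.right.value == 0):
            hits.append(node.lineno)
    return hits

buggy = """\
def ratio(total):
    return total / 0
"""
print(find_zero_divisions(buggy))  # -> [2]
```

The defect is reported before any test case is written or run, which is precisely the early-detection benefit the panel discusses.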
This panel aims to stimulate discussion about how large companies experience static analysis in the real world: feedback on the use of the technique in critical software systems, the issues and limitations of the most widely used COTS tools for static analysis, expected benefits, and estimated ROI.
Panelists will engage in a role-playing exercise to identify the challenges, issues, and synergies related to applying static code analysis in industry and academia.



IWPD workshop panel:

Program Debugging: Research and Practice


Monday, Nov. 3, 11:00am
 
Moderator:
W. Eric Wong, University of Texas at Dallas

Panelists:

Mladen Vouk

Franz Wotawa

Regardless of the effort spent on developing a computer program, it may still contain bugs. In fact, the larger and more complex a program, the higher the likelihood that it contains bugs. When a program fails on a test case, this reveals that bugs are present, and the burden is then on the programmers to locate and fix them. However, program debugging can be extremely time-consuming and tedious, especially given the size and complexity of today's software. Manual debugging alone is certainly not the right approach.
With this realization, researchers have proposed various techniques to assist programmers in finding and fixing bugs more effectively and efficiently. Yet, many questions still remain open and need to be further explored:

  • How usefully can these techniques be transferred from laboratory settings to real-life industry environments?
  • Are automated debugging techniques actually helping practitioners?
  • Are the assumptions adopted by research-oriented debugging techniques valid in practice?
  • Do we have solid data from rigorous case studies to prove the feasibility and advantage of using existing research prototypes?
  • What are the gaps between the best-of-breed practices in industry and the most advanced techniques proposed in academia?
  • What are the limitations and challenges of current research in program debugging and what are the most urgent needs in practice?
Each panelist will first give a short presentation reporting his or her experience in applying research methodologies and techniques to debugging large and complex real-life software systems, and the challenges that had to be overcome. The floor will then be open for the audience to express their concerns and comment on the current research and practice of program debugging. A discussion of possible solutions will also be conducted.
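One well-studied family of the research techniques at issue here is spectrum-based fault localization, which ranks program statements by how strongly their execution correlates with failing tests. The sketch below uses the Ochiai suspiciousness score on an invented toy coverage spectrum; statement names and counts are illustrative, not from any real study.

```python
# Minimal sketch of spectrum-based fault localization: rank statements
# by the Ochiai suspiciousness score, computed from how often each
# statement is covered by failing vs. passing tests. Toy data only.
import math

def ochiai(covered_failed, covered_passed, total_failed):
    """Ochiai suspiciousness for one statement."""
    if covered_failed == 0:
        return 0.0
    return covered_failed / math.sqrt(
        total_failed * (covered_failed + covered_passed))

# statement -> (times covered by failing tests,
#               times covered by passing tests)
spectra = {
    "s1": (2, 3),   # covered by both failing and passing tests
    "s2": (2, 0),   # covered only by failing tests -> most suspicious
    "s3": (0, 4),   # covered only by passing tests
}
total_failed = 2

ranking = sorted(spectra,
                 key=lambda s: ochiai(*spectra[s], total_failed),
                 reverse=True)
print(ranking)  # -> ['s2', 's1', 's3']
```

Whether such rankings actually save practitioners time on large systems is exactly the kind of open question the panel's discussion addresses.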



WoSoCer workshop panel:

From Research to Certification


Monday, Nov. 3, 5:00pm
 
Moderator:
Marc Förster

Panelists:

Henrique Madeira

Myron Hecht

Nuno Silva

Jonny Vinter

For decades, the safety-critical industry as a whole has followed a conservative approach to safety. On the one hand, regulatory authorities, fearing the potential risks, reject or discourage the adoption of recent innovations, limiting the complexity of functions allocated to software that could otherwise provide benefits to users and a "competitive advantage" to industry. On the other hand, researchers are often interested in the theoretical aspects of their own work, without considering market and industrial needs. This gap can be attributed to a number of factors, such as communication issues, cultural differences, and fear of change. Topics of interest in the field of software reliability include (but are not limited to):

  • Fault injection
  • Formal methods
  • Requirement engineering
  • Verification of emergent behaviors and systems-of-systems
  • ...
This panel will discuss the main challenges behind the difficult transition from research to real-world applications, as well as potential approaches to ease this transition. The invited panelists include experts from academia and industry who will share the experience and perspectives they have gained in large international projects.



ASSURE workshop panel:

Formalism, Automation, and Tool Support for Assurance Cases


Wednesday, Nov. 5, 5:00pm
 
Moderator:
Ibrahim Habli

Panelists:

John Knight, University of Virginia, USA

Dave Higham, Delphi Diesel Systems, UK

Kenji Taguchi, AIST, Japan

Over the last twenty years, there has been increasing interest in using structured argumentation notations such as GSN (Goal Structuring Notation) or CAE (Claims-Argument-Evidence) to communicate the structure of an assurance argument. While such arguments are structured, they remain informal. There is growing interest in exploring how these informal arguments may be modelled in formal logic, potentially opening up forms of analysis and automation not possible with informally recorded arguments. This panel will discuss the considerations in balancing the roles of informal and formal logic in modelling assurance case arguments.
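As a toy sketch of what "modelling an informal argument in formal logic" can mean, the fragment below encodes a GSN-style decomposition as a propositional implication in Lean. The claim names and the argument strategy are invented for illustration; a real assurance case is far richer than a single implication.

```lean
-- Toy sketch: a GSN-style claim decomposition rendered as an
-- implication in propositional logic. All names are hypothetical.
variable (HazardsIdentified HazardsMitigated TestingAdequate SystemSafe : Prop)

-- Strategy node: the top claim follows from three sub-claims,
-- each of which would be discharged by evidence in the full case.
theorem safety_case
    (strategy : HazardsIdentified ∧ HazardsMitigated ∧ TestingAdequate → SystemSafe)
    (e1 : HazardsIdentified) (e2 : HazardsMitigated) (e3 : TestingAdequate) :
    SystemSafe :=
  strategy ⟨e1, e2, e3⟩
```

The benefit a formalization like this hints at is mechanical checking: a proof assistant can confirm that the evidence offered actually discharges the top-level claim under the stated strategy, which an informally drawn diagram cannot.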