Towards Modeling the Modern Distributed Systems Fabric
Schahram Dustdar, Vienna University of Technology, Austria
Megamodels on the Catwalk
Ralf Lämmel, University of Koblenz, Germany
A Modeler's Journey: From Hostility to Possibility
Jim Logan, Director of Semantic Technologies & Interoperability at No Magic, Inc., United States
Towards Modeling the Modern Distributed Systems Fabric
Schahram Dustdar
Vienna University of Technology
Austria
Brief Bio
Schahram Dustdar is Full Professor of Computer Science heading the Research Division of Distributed Systems at TU Wien, Austria. He holds several honorary positions: the University of Southern California (USC), Los Angeles; Monash University, Melbourne; Shanghai University; Macquarie University, Sydney; and the University of Groningen (RuG), The Netherlands (2004-2010). From December 2016 until January 2017 he was a Visiting Professor at the University of Sevilla, Spain, and from January until June 2017 he was a Visiting Professor at UC Berkeley, USA.
From 1999 to 2007 he was co-founder and chief scientist of Caramba Labs Software AG in Vienna (acquired by Engineering NetWorld AG), a venture-capital co-funded software company focused on software for collaborative processes in teams. Caramba Labs was nominated for several international and national awards: the World Technology Award in the category of Software (2001); Top Start-up Companies in Austria (Cap Gemini Ernst & Young, 2002); and the MERCUR Innovation Award of the Austrian Chamber of Commerce (2002).
He is founding co-Editor-in-Chief of the new ACM Transactions on Internet of Things (ACM TIoT) as well as Editor-in-Chief of Computing (Springer). He is an Associate Editor of IEEE Transactions on Services Computing, IEEE Transactions on Cloud Computing, ACM Transactions on the Web, and ACM Transactions on Internet Technology, and serves on the editorial boards of IEEE Internet Computing and IEEE Computer. Dustdar is a recipient of the ACM Distinguished Scientist Award (2009) and the IBM Faculty Award (2012), an elected member of the Academia Europaea: The Academy of Europe, where he is chairman of the Informatics Section, and an IEEE Fellow (2016).
Abstract
With the advent of Edge Computing and the coming of age of Artificial Intelligence, there is a strong demand to integrate Edge Computing, Cloud Computing, and AI, which gives birth to Edge Intelligence. In this talk, we discuss insights into this new interdisciplinary field from a broader vision and perspective. We discuss the core concepts and the research roadmap, which should provide the necessary background for potential future research programs in Edge Intelligence, and discuss initial thoughts on the role of models and abstractions.
Megamodels on the Catwalk
Ralf Lämmel
University of Koblenz
Germany
Brief Bio
Ralf Lämmel has been Professor of Computer Science at the University of Koblenz-Landau in Germany since 2007. In the past, he held positions at Facebook London, the University of L'Aquila, Microsoft (USA), the Free University of Amsterdam, CWI (the Dutch Center for Mathematics and Computer Science), and the University of Rostock, Germany.
His research and teaching interests include software/data engineering, software reverse engineering, software re-engineering, software language engineering, mining software repositories, program comprehension, functional programming, grammar-based and model-based techniques, and megamodeling. In his latest work at Facebook, he applies machine learning (in a broad sense) in an infrastructural context while developing an increasing interest in data engineering and science. Areas of application concern ownership management, infrastructure simulation, and developer workflow analysis.
Ralf Lämmel is one of the founding fathers of the international summer school series on Generative and Transformational Techniques in Software Engineering (GTTSE) and of the international conference on Software Language Engineering (SLE). He is the author of the Springer textbook on software language engineering, Software Languages: Syntax, Semantics, and Metaprogramming (Springer, 2018), which received the Choice Award "Outstanding Academic Title" in 2019.
Abstract
Megamodels are domain-specific models (like most or all models).
What's the domain in the case of megamodels? It depends!
It may be, for example, the domain of model transformation or the domain of software technologies.
Roughly, a megamodel is a model whose elements are software artifacts (such as models or programs).
In fact, those artifacts aren't necessarily as concrete as actual models; they could be opaque, as in the case of libraries.
The relationships in a megamodel thus relate software artifacts (with conformance being the obvious example).
Hold on, megamodels also capture knowledge about the domain.
Thus, naturally, model elements and relationships also concern software concepts and software languages.
Megamodels serve abstraction (like most or all models).
What sort of abstraction is going on here? This is where things get interesting.
Megamodels abstract by effectively treating the model elements that stand for software artifacts as variables.
Model elements for software concepts and languages are supposedly drawn from an appropriate ontology.
All that matters are the constraints on the model elements expressed by the relationships.
Thus, megamodels are like patterns of conglomerations of related artifacts, concepts, and languages.
Megamodels come in many flavors: prescriptive, descriptive, executable, exemplified, renarratable, etc.
Time for a catwalk to show off a model of megamodeling.
That is, let's understand all the dimensions of characterizing megamodels.
Megamodeling is a niche, if you go by explicit mentions of the paradigm in software engineering.
Megamodeling is omnipresent, if you acknowledge all related hacks and workarounds that are found in the wild.
In this talk, I also hint at where I saw some megamodeling in software development at Facebook, and where I wished for more of it.
Let's discuss how megamodeling could be generalized and used more profoundly in software engineering.
To this end, we need to continue working on these premises:
i) Megamodeling languages are DSLs, subject to designated efforts in analysis, design, and implementation.
ii) Especially analysis involves ontology engineering for concepts, languages, types of artifacts, and relationships.
iii) The most important DSL semantics serves validation of megamodel instances against a megamodel.
iv) The alignment of megamodels and reality requires MSR-style information retrieval and reverse engineering.
v) What the AST is to classical software languages, the knowledge graph is to megamodeling DSLs, as the sketch below illustrates.
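To make these premises a bit more tangible, here is a minimal, purely illustrative sketch in Python; the names (Megamodel, conformsTo, orders.uml, UML.ecore) are assumptions made up for this example, not a concrete megamodeling language or tool. A megamodel is represented as a pattern whose elements are variables standing for software artifacts, constrained by relationships such as conformance; validating a megamodel instance then means checking a concrete binding of those variables against facts observed about reality, kept here as a toy knowledge graph of triples (cf. premises iii) and v)).

# A hypothetical megamodel as a pattern over artifact variables (illustration only).
from dataclasses import dataclass, field
from typing import Dict, List, Set, Tuple

Triple = Tuple[str, str, str]  # (relationship kind, source, target)

@dataclass
class Megamodel:
    # Variables for software artifacts, concepts, and languages.
    elements: Set[str] = field(default_factory=set)
    # Relationships constraining those elements (e.g., conformsTo, elementOf).
    relationships: List[Triple] = field(default_factory=list)

    def relate(self, kind: str, source: str, target: str) -> None:
        self.elements.update({source, target})
        self.relationships.append((kind, source, target))

    def validate(self, binding: Dict[str, str], facts: Set[Triple]) -> List[Triple]:
        """Return the required relationships that do not hold for the bound artifacts."""
        missing = []
        for kind, source, target in self.relationships:
            triple = (kind, binding.get(source, source), binding.get(target, target))
            if triple not in facts:
                missing.append(triple)
        return missing

# A megamodel of plain metamodel-based modeling, stated as a pattern:
mm = Megamodel()
mm.relate("conformsTo", "aModel", "aMetamodel")  # some model conforms to some metamodel
mm.relate("elementOf", "aMetamodel", "MOF")      # that metamodel is itself a MOF element

# Facts as they might be retrieved MSR-style from repositories or tool metadata:
facts = {
    ("conformsTo", "orders.uml", "UML.ecore"),
    ("elementOf", "UML.ecore", "MOF"),
}

# Binding the variables to concrete artifacts yields a megamodel instance:
binding = {"aModel": "orders.uml", "aMetamodel": "UML.ecore"}
print(mm.validate(binding, facts))  # prints [] -- the instance satisfies the megamodel

A real megamodeling DSL would, of course, draw its concepts and relationship kinds from a designated ontology (premise ii)) and populate its facts via MSR-style retrieval and reverse engineering (premise iv)).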
A Modeler's Journey: From Hostility to Possibility
Jim Logan
Director of Semantic Technologies & Interoperability at No Magic, Inc.
United States
Brief Bio
Jim Logan is the Director of Semantic Technologies and Interoperability and the Product Manager of the Cameo Concept Modeler at No Magic. As the company’s expert in semantics, his primary focus is determining customer needs, forming product vision, and delivering solutions to customers. His secondary focus is leading the development team towards delivery without architectural entropy. Over the past 20 years, Jim has been active at the Object Management Group, currently serving as co-chair of the Analysis and Design Task Force and developing a new specification for semantic modeling using the Unified Modeling Language. Before joining No Magic, Jim was an accomplished architect with more than 30 years of professional experience in the modeling, architecture, and development of systems in a wide range of domains, including embedded real-time telecommunications, network management, finance, acquisition, human resources, military, and health care.
Abstract
Although industry has become increasingly hostile towards modeling over the past twenty years, things seem to be changing, which is great news for those of us with a passion for modeling. We must be sure not to repeat the mistakes that created such hostility, though. This talk exhibits avoidable problems from real projects, explains how modeling reality with semantics-based techniques achieves better results, and arms you with knowledge of the tools and organizations that are changing the modeling landscape within industry.