Archive
Blog - posts for November 2019
Nov 28 2019
PROFES 2019, Barcelona
Event: 20th International Conference on Product-Focused Software Process Improvement
Date: November 27 to 29, 2019
Place: Barcelona, Catalunya, Spain
Registration: https://profes2019.upc.edu/registration/
Date: Nov 28, 2019
Time: 14:15-14:30
Session: 4A - Software development II
Paper: Software Knowledge Representation to Understand Software Systems (Short Paper)
Authors: Victoria Torres, Miriam Gil and Vicente Pelechano
Check out the PROFES 2019 pictures and presentations
Nov 26 2019
Stefan Ilić, Technikon Researcher
Sharing broad knowledge of an entire software system
DECODER is an ambitious project whose goal is to make the development of large software systems easier. It offers tools to unify all project artefacts, from code to documentation, in a single database with the relevant relations between them.
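To illustrate the idea with a purely hypothetical sketch (this is not DECODER's actual data model or API, just one way to picture "artefacts plus relations in one store"):

```python
# Hypothetical illustration of a unified artefact store: heterogeneous
# artefacts (code, docs, tests, ...) and labelled relations between them
# are kept together so they can be queried as one knowledge base.
from dataclasses import dataclass, field

@dataclass
class Artefact:
    kind: str   # e.g. "source_file", "doc_page", "test"
    name: str

@dataclass
class KnowledgeBase:
    artefacts: list = field(default_factory=list)
    relations: list = field(default_factory=list)  # (source, label, target)

    def relate(self, src: Artefact, label: str, dst: Artefact) -> None:
        self.relations.append((src, label, dst))

kb = KnowledgeBase()
code = Artefact("source_file", "parser.c")
doc = Artefact("doc_page", "Parser design notes")
kb.artefacts += [code, doc]
kb.relate(doc, "documents", code)  # the doc page documents the source file
```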
Nov 15 2019
SFScon 2019, Bolzano, Italy
Event: SFScon 2019
Date: November 15, 2019
Place: Bolzano, South Tyrol, Italy
The DECODER project will be presented in the open source conference track organised by OW2 at SFScon, one of Europe’s most established annual conferences on Free Software, held in Italy’s South Tyrol region. The track takes place on Friday, Nov. 15, after the afternoon break and will introduce several European research projects.
Nov 14 2019
Four DECODER & TESTAR Workshops
Between November 14th and 15th, 2019, the UPV team held four workshops and government/industry meetings featuring presentations of the DECODER and TESTAR projects:
- Meeting with ING Bank and European partners of the Industrial-grade Verification and Validation of Evolving Systems project to present and discuss the role of TESTAR, in Amsterdam, Netherlands
- TESTAR training workshop at the Xebia company in Amsterdam, Netherlands
- Project presentation and TESTAR hands-on session with the Ministry of Justice and Security team from Gouda, Netherlands
- TESTAR training workshop at the Newspark BV company in Nieuwegein, Netherlands
Nov 05 2019
Is BERT a Game Changer in NLP?
BERT (Bidirectional Encoder Representations from Transformers) is an open-source NLP pre-training model developed by researchers at Google in 2018. It has inspired multiple NLP architectures, training approaches and language models, including Google’s Transformer-XL, OpenAI’s GPT-2, ERNIE 2.0, XLNet, and RoBERTa.
For instance, BERT is now used by Google Search to provide more relevant results, and, as Bharat S Raj expects, it can also power smarter chatbots in conversational AI applications.
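As a minimal sketch of what working with BERT looks like in practice, the snippet below loads a pre-trained model with the Hugging Face transformers library and PyTorch (our choice of toolkit, not one named in the post) and extracts contextual token embeddings:

```python
# A minimal sketch, assuming the Hugging Face `transformers` library and
# PyTorch are installed (pip install transformers torch); neither toolkit
# is named in the original post.
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# Tokenize a sentence and run it through the pre-trained encoder.
inputs = tokenizer("BERT reads text in both directions.", return_tensors="pt")
outputs = model(**inputs)

# One contextual vector per token: shape (batch, sequence_length, 768).
print(outputs.last_hidden_state.shape)
```

The key point is that, unlike earlier left-to-right language models, each token vector here is conditioned on context from both directions of the sentence, which is what "bidirectional" in the name refers to.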