Business Development And Innovation

The Competence Center of the Faculty of Informatics at ELTE University, in addition to carrying out research projects relevant to the industry, places great emphasis on establishing a strong innovation environment within the Center, building upon the ongoing projects to lay the foundation for the long-term transformation of the Faculty. Our prominent endeavor is to achieve significant success by transforming research results generated at the Faculty into commercially viable projects, through their development, enhancement, and incubation.

To achieve this general goal, we consider it important to:

  • Prepare for the comprehensive commercial exploitation, further exploration, and development of research results generated under the umbrella of the Competence Center.
  • Create the conditions and processes for the commercial utilization of research results originating from any corner of the ELTE Faculty of Informatics (in this regard, we greatly rely on the ongoing InnoChange project involving the participation of ELTE FI).
  • Expand and keep up-to-date the skills, competencies, and knowledge required for successful innovation.
  • Strengthen the constructive, proactive, ambitious, collaborative, creative, and failure-tolerant mindset at the Faculty, which is essential for sustaining successful innovation activities.
  • Ensure the long-term sustainability and positive impact of the Competence Center itself.

The first step in achieving our goals is to reevaluate our current research projects from a business perspective, focusing on questions such as their potential to address business problems, their relevance to (additional) industrial stakeholders, and their ability to meet market demands.

The following presents a portion of the projects running within the Competence Center and those related to its work, based on these criteria.

Software-Compensated Image Processing

Business/Market Problem:

The majority of radiological examinations expose patients to a non-negligible amount of radiation (this is true of CT and of nuclear imaging techniques such as SPECT). Radiation exposure can be reduced by injecting smaller amounts of radioactive tracers, but smaller doses typically lead to poorer image quality, which compromises the accuracy and reliability of diagnosis.

The joint development of Mediso and ELTE IK aims to improve the quality of images produced during radiological examinations – specifically CT and SPECT – using mathematical methods and artificial intelligence-based procedures, so that image quality, and thus the accuracy and reliability of the diagnosis, can be maintained even when smaller amounts of radioactive tracers are injected.

As a first step, the project focuses on improving image processing algorithms used in cardiac radiology, in particular the automatic evaluation of so-called flow-through procedures used to examine patients’ cardiac images. One part of the task is the software mapping of the human heart into the system (a prerequisite for applying artificial intelligence and machine learning technologies); the other is the development and refinement of algorithms that improve image quality and thereby automate image recognition and evaluation. Automation based on software-enhanced imaging and analysis has the additional advantage of shortening the time the process takes, so more examinations can be performed in a unit of time and more patients can be seen.
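
The low-dose trade-off described above can be illustrated with a toy sketch in Python – hypothetical numbers, not the project’s actual algorithms: Poisson noise models the photon statistics of a reduced tracer dose, and even a crude smoothing filter recovers part of the lost quality (the real system relies on far more sophisticated AI-based methods).

```python
import numpy as np

def simulate_low_dose(image, dose_fraction, rng):
    """Simulate a reduced-tracer acquisition: photon counts scale with the
    injected dose, so a lower dose yields noisier Poisson-distributed counts."""
    noisy_counts = rng.poisson(image * dose_fraction).astype(float)
    return noisy_counts / dose_fraction  # rescale back to the original range

def box_denoise(image, radius=1):
    """Very simple denoising baseline: a mean filter over a (2r+1)^2 window."""
    padded = np.pad(image, radius, mode="edge")
    out = np.zeros_like(image)
    size = 2 * radius + 1
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    return out / (size * size)

def mse(a, b):
    return float(np.mean((a - b) ** 2))

rng = np.random.default_rng(0)
phantom = np.zeros((64, 64))
phantom[24:40, 24:40] = 100.0  # a bright square standing in for the myocardium
low_dose = simulate_low_dose(phantom, dose_fraction=0.1, rng=rng)
restored = box_denoise(low_dose)
print(mse(low_dose, phantom) > mse(restored, phantom))  # True: denoising reduces error
```

Even this naive filter trades a little spatial resolution for a large reduction in noise; the project’s AI-based methods aim to achieve the noise reduction without the resolution loss.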

How Mediso benefits from the solution:

A measure of the project’s success is that the image quality enhancement algorithms created will be formalised and brought to a level of quality at which they can be directly incorporated into Mediso’s internationally competitive, state-of-the-art medical imaging equipment, further enhancing and modernising the data acquisition, reconstruction and image processing software running on Mediso’s world-renowned radiology devices. This success criterion has already been met for the cardiac models, reorientation protocols and motion correction solutions developed in collaboration with ELTE, and further results are expected to be incorporated into the Mediso product portfolio in the next one to two years. Particularly promising is the creation of a new product category capable of accommodating interchangeable apertures: it would fill an essential niche in the radiology equipment market by bridging the two types of radiology equipment currently available, combining the advantages of general-purpose SPECT and CT cameras, which can be used for full-body scans, with those of cameras specialised for specific body areas and groups of imaging protocols.

What part/segment of the task is being solved by ELTE IK:

A team of ELTE IK researchers and students has undertaken several key subtasks, including the optimisation of the artificial intelligence – so-called “supervised” and “unsupervised” deep learning – techniques used in the project to build the cardiac models, the design of the data structures and procedures that will frame the development, the prototyping of the software developed with scientific rigour, and the development of the theoretical foundations of the overall software system.

Why Mediso partnered with ELTE IK:

ELTE IK researchers brought to the project a professional-theoretical background and R&D competencies that ensured both a solid scientific basis and the timeliness of the developments – expertise at a depth not present even at Mediso, despite the company’s exceptionally strong R&D capacity. A further benefit of the cooperation for Mediso is that it has produced a base of dozens of well-trained students who understand Mediso’s systems not only at a theoretical level but also through concrete practical experience, and who can therefore work effectively on their further development. The same benefit is reflected in the fact that the students involved in the projects receive a clearly useful, tangible, practical education for the labour market.

More details:

The personal story of Ádám Szűcs: as an ELTE doctoral student, he was given a free hand by Mediso to build the foundations of the project and to organise and manage – in addition to his ELTE status, now as an expert employed by Mediso – the implementation of the complex project over several years.

Specialised Test Simulator

Business/Market Problem:

The production cost of Mediso’s justly world-famous multimodality equipment – that is, devices enabling combinations of different diagnostic examinations (SPECT/CT, PET/CT, PET/MRI) – is rather high owing to its uniqueness, complexity, and variability: on the order of a billion forints per machine. The same uniqueness, complexity, and variability make complete and thorough pre-delivery testing of every machine shipped to customers indispensable, which in this value range represents enormous costs for the company in itself. Keeping the resource needs and financial burden of testing under control is accordingly a critically important business goal for Mediso.

The aim of the ELTE-supported project is to minimise the testing that must be run on the valuable hardware itself – that is, on the physically built future end products – during the design, prototyping and production of the machines developed by Mediso, and to maximise the use of a software simulation environment instead. Dedicated hardware is also needed to run software tests, but the cost of building and operating it is of a different order of magnitude from that of testing on the final products.

How Mediso benefits from the solution:

A specialised simulator tailored to Mediso’s product development needs will bring massive savings to Mediso, while also allowing for optimisation during the product development process, making the latter more efficient and effective. Similar simulators do exist on the market, but at a price that is prohibitively high for Mediso and typically results in dependence on the supplier.

Refactoring Of The Mediso Software Base

Business/Market Problem:

A software development company and the software it develops go through a parallel evolution. First, a monolithic team writes monolithic software, as this is the most efficient in the initial phase. However, as the company and the software evolve, the monolithic software architecture introduces multi-level scaling problems.

First of all, having multiple developers working on a software project requires parallel development processes. In a monolithic system, however, modifications affect the entire software, forcing developers to wait for one another. To avoid idle time, developers usually start working on multiple tasks simultaneously, which undermines the desired level of concentration. Moreover, completed developments create additional work for developers still working on unfinished modifications, and non-human development resources, such as build and test infrastructure, cannot be used efficiently either.

To increase the amount of work the software can accomplish in a unit of time, the software can be scaled vertically, allowing a single instance of the product to perform multiple tasks, or horizontally, by running multiple instances of the product in parallel. Horizontal scaling, however, is not optimal for monolithic software, because every software component of every instance must be scaled together; the architecture does not allow components to be scaled independently.

Finally, there is a growing demand on the customer side for products to move towards a “thin-client” approach, meaning that only the user interface runs on the client’s computer, while the core functionality of the software resides on a separate “server” computer. Another requirement for modern software products is their efficient operation in private and public cloud-based environments. These solutions are difficult to implement in a monolithic architecture.

Mediso Ltd., which is among the international leaders in nuclear medicine and modern hybrid imaging techniques, has reached a critical size in terms of both the number of developers and customer demands; its monolithic software architecture is now hindering the further development and scalability of its software systems. The transition to more modern software technologies is further complicated by the special sensitivity and security requirements of Mediso’s products: cybersecurity, which plays an important role in distributed systems, takes precedence over all other aspects for the company. This is especially true when certain components run on public, or at least non-customer-controlled, computers.

Solution:

The modular architecture that allows independent scaling of individual software components provides a solution to the previously outlined problems. Proper separation of software components greatly facilitates overcoming development and user scalability issues.

Since the components lead independent lives from a development perspective, the number of contention situations arising during resource allocation decreases significantly. Components unmodified in a given development cycle do not burden the development infrastructure, and developers no longer face idle-time issues.

The modular architecture also benefits the scalability of the product. Development time for horizontal scaling decreases, while vertical scaling benefits from the components being scalable separately, creating more favorable resource utilization. Software based on a modular architecture also simplifies the implementation of thin-client and distributed, cloud-based solutions.
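
A back-of-envelope sketch – Python, with entirely hypothetical component names and numbers – of why this matters for horizontal scaling: a monolith must replicate the whole bundle to match the load of its hottest component, while a modular system replicates only the component that needs it.

```python
# Hypothetical per-component (resource_cost, required_load_factor) pairs:
# "reconstruction" needs 4x capacity, the other components only 1x.
components = {"ui": (2, 1), "reconstruction": (8, 4), "storage": (4, 1)}

def monolithic_cost(components):
    """The whole bundle is replicated enough times for the hottest component."""
    bundle = sum(cost for cost, _ in components.values())
    replicas = max(load for _, load in components.values())
    return bundle * replicas

def modular_cost(components):
    """Each component is scaled independently to its own load."""
    return sum(cost * load for cost, load in components.values())

print(monolithic_cost(components))  # 56: (2 + 8 + 4) * 4
print(modular_cost(components))     # 38: 2*1 + 8*4 + 4*1
```

The gap widens as the load profile becomes more uneven, which is exactly the situation in long-lived, feature-rich systems.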

Naturally, the data flow between components must also meet enhanced security requirements, with network communication, data stored on disk, and potentially in memory requiring adequate encryption. Another consideration is to ensure that unauthorized individuals cannot access patient data, even if they have direct access to the user interface of the product or any of its components.

What part/slice of the task does ELTE Faculty of Informatics solve:

In response to a request from Mediso Ltd., the ELTE Faculty of Informatics has embarked on the development of a modern architecture suitable for distributed software. The work involves creating a framework that enables separate development of individual components of a modular software product. Throughout the design process, the usability of existing solutions has been a primary consideration, ensuring that functionalities that fulfill the original requirements can be easily transferred to the new software. The framework facilitates efficient and secure communication between components while ensuring the security of stored data.

Among the deliverables is a prototype of a medical image viewer application that demonstrates the functioning of the framework. The application consists of two components: a thin-client viewer and a background component that instructs the viewer on what to display on the screen. Two versions of the thin-client viewer were developed, a traditional graphical user interface (GUI) application and a web-based viewer. Both serve only as display interfaces, with the actual work performed by the server component.
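
The division of labour between the two components can be sketched roughly as follows – Python with a hypothetical message format, not the actual prototype’s protocol: the server side does the heavy lifting and emits declarative display instructions, while the thin client only renders what it is told.

```python
import json

def server_render_instructions(study_id: str) -> str:
    """Server side: performs the actual work (loading, windowing, layout) and
    sends the client only declarative display instructions as JSON.
    The payload shape here is purely illustrative."""
    instructions = {
        "study": study_id,
        "layout": "2x1",
        "viewports": [
            {"series": "CT", "window": {"center": 40, "width": 400}},
            {"series": "SPECT", "colormap": "hot"},
        ],
    }
    return json.dumps(instructions)

def thin_client_display(message: str) -> list:
    """Client side: no domain logic at all, it only renders the instructions."""
    spec = json.loads(message)
    return [f"viewport {i}: {vp['series']}" for i, vp in enumerate(spec["viewports"])]

rendered = thin_client_display(server_render_instructions("demo-001"))
print(rendered)  # ['viewport 0: CT', 'viewport 1: SPECT']
```

Because the client holds no domain logic, the same server component can drive both the traditional GUI viewer and the web-based viewer mentioned above.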

During the collaboration, researchers and students from the ELTE Faculty of Informatics maintain continuous contact with the developers of Mediso Ltd. Regular meetings ensure that both sides receive continuous feedback throughout the design and implementation process, guaranteeing that the framework and the prototype application always meet current needs. The knowledge accumulated during the university work is shared with Mediso’s developers through corporate training programs, facilitating the utilization of the framework in a corporate environment.

What advantages do Mediso and ELTE gain from the project:

Through the project, the corporate partner gains access to a framework that enables the transformation of their existing monolithic software into a modular software, while acquiring the knowledge necessary for usage and development through continuous collaboration and communication efforts. This allows them to possess a future-proof software that meets the expectations of the industry and customers, enabling rapid implementation of new functionalities, exploration of new verification directions, and the exploitation of previously unattainable cost-saving opportunities due to more efficient resource utilization.

The researchers and students from the ELTE Faculty of Informatics engage in valuable research and development work that addresses real user needs in the market. This has a positive impact on all participants in the project. Students can immerse themselves in the real-life challenges faced by the corporate partner during their studies, making them immediately deployable and easily trainable upon completion of their studies. The researchers solve important problems for the industry, while the results are also valuable from an academic standpoint, potentially forming the basis for publications.

Building Healthcare Data Repositories

Business/Market Problem:

Our everyday online activities generate a vast amount of data: our social media activity, search queries, the use of cloud-based computing solutions, and the utilization of various internet-connected devices all continuously produce data, to name just a few of the most common data sources. It is no wonder that data storage, management, and security have become central concerns for everyone, whether individuals, companies, or governments. The world is craving data management systems that are simultaneously secure and protect the privacy rights associated with the data.

The other side of the coin, however, is that data collected from various sources can be valuable for many purposes of common interest, and the effective utilization of data often requires central accessibility and processing. What typically obstructs centralized accessibility is the fragmentation and inconsistency of data and the formal incompatibilities between sources – in the corporate business environment commonly referred to as “data silos.”

The difficulty arises from the fact that, on one hand, the protection of privacy rights and data security, and on the other hand, the efficiency of data utilization, often point in opposite directions. This contradiction is nowhere more evident than in healthcare.

Solution:

An increasingly prevalent approach to managing large amounts of data is the use of “data lakes”: repositories that gather different types of data in their raw, original, unprocessed form. In other words, the stored data is not standardized or prepared for any specific use or goal. The main goal of the joint project between ELTE and E-Group is to establish a data lake for collecting, storing, and analyzing various types of healthcare data. In the current phase of the project, these data primarily consist of the data generated by the healthcare activities and services of the University of Pécs, along with related external data. The data lake to be built aims to support data-driven healthcare innovation by bringing together IT experts, engineers, doctors, researchers, and clinicians, facilitating interactions and information exchange among them, and thereby easing the resolution of pressing issues such as GDPR compliance.

The collection and integration of diverse data, ranging from retrospective health records to prospective data from healthcare devices, is not a trivial task. Several complex subtasks need to be addressed, including data protection (i.e., anonymization of the data), protection of intellectual property (through data watermarking), recording the data’s provenance, analyzing data integrity, data-driven identification and authentication, and preparing the system for federated learning applications (see the “Federated Learning on Healthcare Data” project).
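
As an illustration of the anonymization subtask, one common building block is keyed pseudonymization – a minimal Python sketch with a hypothetical key-handling scheme, not the project’s actual protocol: direct identifiers are replaced with a keyed hash, so records of the same patient remain linkable inside the data lake while the original ID cannot be recovered without the key.

```python
import hashlib
import hmac

# Hypothetical key handling; in a real deployment the key would live in a
# vault/HSM and never appear in source code.
SECRET_KEY = b"replace-with-a-vault-managed-key"

def pseudonymize(patient_id: str) -> str:
    """Replace a direct identifier with an HMAC-SHA256 pseudonym: deterministic
    (so records stay linkable) but irreversible without the secret key."""
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

record = {"patient_id": "P-12345", "modality": "SPECT", "date": "2023-04-01"}
safe_record = {**record, "patient_id": pseudonymize(record["patient_id"])}

print(safe_record["patient_id"] == pseudonymize("P-12345"))  # True: linkable
print("P-12345" in safe_record.values())                     # False: raw ID removed
```

Pseudonymization alone does not make data anonymous in the GDPR sense; it is one layer among the protections (watermarking, provenance, integrity analysis) listed above.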

Significance of the project:

While some specific outcomes of the project, whether in the fields of informatics, healthcare services, or law, may be significant in themselves, the full realization of the project opens up new possibilities for data utilization that affect all areas of healthcare simultaneously. Utilizing the data collected and prepared by the project can improve the quality and cost-effectiveness of healthcare delivery and is expected to yield a series of innovations in diagnostics, therapy, and medical processes in general. Considering the sensitivity and enormous strategic significance of healthcare data, the importance of this cannot be emphasized enough.

The part of the project carried out by the Faculty of Informatics at ELTE:

The research group of the Faculty of Informatics at ELTE is responsible for the theoretical foundation of the project, particularly focusing on the implementation of security strategies and protocols. The construction of data structures, main system prototypes, and the development of internal communication frameworks were carried out in close collaboration between ELTE and E-Group.

Why did E-Group choose the Faculty of Informatics at ELTE as a partner:

On one hand, the Faculty of Informatics at ELTE has numerous references and relevant project experience. On the other hand, under the leadership of Dr. Attila Kovács and Dr. Péter Ligeti, the cybersecurity department of ELTE plays a leading role in this field of science. E-Group is an industrial player and, as a result, lacks the necessary theoretical and scientific competencies for fully addressing the project’s security and data privacy aspects. Furthermore, both E-Group and ELTE are members of the EIT community and have previously collaborated closely within that framework.

Further details:

Yuping Yan, the project’s principal researcher, is concurrently a doctoral student at ELTE and a member of the E-Group team.

With her work focusing on techniques that ensure the privacy of large-scale databases, Yuping Yan plays a bridging role between the two organizations in the project. The results of the project have been published in six conference presentations and one journal article.

Federated Learning On Healthcare Data

Business/Market Problem:

As machine learning models increasingly support diagnosis and comprehensive statistical methods become essential parts of healthcare processes, there is a significant expectation and demand for the widespread application of these technologies. However, training efficient machine learning models, particularly artificial neural networks, requires a substantial amount of relevant data that is typically not available to individual healthcare institutions.

One potential solution could be the consolidation of locally available databases. However, this is not feasible due to the extraordinary sensitivity of the data and understandably stringent regulations.

The approach that truly offers a solution to this problem is federated learning: a model training method based on collaboration between databases that does not require the data to leave the host institution’s system. Within the framework of the ELTE Software and Data-Intensive Services Competence Center (ELTE KK), the university and E-Group have set the goal of developing a comprehensive system that enables the development of healthcare applications based on this principle, providing strong data protection guarantees for patients. The project is closely related to another parallel development by ELTE and E-Group, which aims to create data lakes for the collection and management of healthcare data.

Building a practical and usable federated learning (FL) framework comes with several challenges. One of these challenges is data standardization, ensuring that data in different formats can be harmonized and made compatible with each other. Another task is the careful design of software architecture, ensuring the optimal and secure arrangement of system components. Additional tasks include incorporating extra safeguards for personal data protection to defend the system against cyberattacks targeting neural networks.
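
The core principle can be illustrated with a minimal federated averaging (FedAvg-style) sketch in Python – toy data and a toy linear model, not the E-Group system: each “hospital” trains on its own private data, and only model weights travel to the aggregation server.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=20):
    """One client's step: gradient descent on a linear model y ~ X @ w,
    using only the client's own data, which never leaves its system."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_average(client_weights, client_sizes):
    """Server step of FedAvg: weight each client's model by its dataset size."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

rng = np.random.default_rng(1)
true_w = np.array([2.0, -1.0])
# Two "hospitals" with private datasets of different sizes.
datasets = []
for n in (50, 200):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + rng.normal(scale=0.01, size=n)
    datasets.append((X, y))

global_w = np.zeros(2)
for _ in range(10):  # federated rounds: broadcast, train locally, aggregate
    updates = [local_update(global_w, X, y) for X, y in datasets]
    global_w = federated_average(updates, [len(y) for _, y in datasets])

print(np.allclose(global_w, true_w, atol=0.05))  # True: close to the true model
```

A production framework adds the layers the paragraph above lists on top of this loop: data standardization, secure aggregation, and defences against attacks targeting the shared model.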

Why is the project beneficial for E-Group:

E-Group expects direct business benefits from the development of the framework, as it intends to offer federated learning on healthcare data and neural network model training as a service to various market players. Specifically, it enables the provision of appropriately prepared and carefully protected data from partner healthcare institutions, which can be used by anyone for training, testing, and updating machine learning models. If E-Group’s clients want to operationalize a new or modified machine learning model or seek complex statistical relationships, they simply upload the model to the E-Group system, specify the necessary parameters and the scope and characteristics of the data to be used. The system then executes the federated learning process and delivers the trained, ready-to-use model according to the specifications.

The collaboration with ELTE allows E-Group to leverage the university’s theoretical and practical knowledge.

In which parts of the project are ELTE team members involved:

The university researchers provide the theoretical background and partly the technical background in the areas of neural networks, machine learning, federated learning, and cryptography. The ELTE team assists in the practical implementation of the framework and the construction of the necessary software architecture. They also analyze potential threats, preventive measures, and countermeasures to ensure the system’s proper functioning, integrity, and the protection of personal data and trade secrets. Furthermore, ELTE researchers offer continuous support in the implementation of communication protocols, algorithms ensuring system security and data confidentiality, where doctoral and master’s students of the university also play a significant role.

Why did E-Group choose ELTE Faculty of Informatics as a partner:

Over the past years, E-Group and ELTE have established a fruitful, mutually beneficial, and long-term collaboration with numerous joint projects and various forms of collaboration. This project is an example of how university researchers and students can directly provide business benefits as partners to a small Hungarian company focusing on research, development, and innovation.

AI-Supported Manufacturing of MEMS Sensors

Business/Market Problem:

Micro-electromechanical systems (abbreviated as MEMS) are cutting-edge technologies widely used in modern industrial environments. These devices are tiny and contain electrical and mechanical components that enable them to sense, control, and operate at a microscopic level, generating macroscopic effects. The technology finds primary applications in industries such as automotive, healthcare, electronics, digital communication, and defense sectors. The ELTE University’s prominent industrial partner, Bosch, a global leader in engineering and technological solutions, has been at the forefront of this technology. Bosch has developed and introduced MEMS sensors and related solutions for a range of applications, including smartphones, tablets, wearable devices, augmented and virtual reality (AR/VR) glasses, drones, robots, smart homes, and the Internet of Things (IoT) (see https://www.bosch-sensortec.com/).

The manufacturing of MEMS is an extremely complex process consisting of hundreds of steps, during which vast amounts of data are collected, resulting in large databases. Real-time analysis of these data poses significant challenges during manufacturing, even though such analysis is crucial for recognizing and predicting errors. A typical analysis may involve millions of components and thousands of different parameters, and in some cases the relationships remain hidden, either for lack of appropriate analytical models or for lack of methods to interpret the data. As a result, it has so far been impossible to utilize this massive amount of complex data to keep the failure rate of MEMS below the desired threshold of 0.1%. The most advanced solution to date has been fine-tuning the manufacturing process with human experts, an approach whose cost limits growth even in world-class facilities like Bosch’s plant in Hatvan.

Solution:

The new paradigm aims to eliminate this dependence on scarce human expertise by relying on statistical analysis and artificial intelligence (AI) – machine learning (ML) and, in particular, deep learning (DL) – to solve complex production optimization problems using the data collected during sensor manufacturing. Statistical and trainable ML algorithms allow complex behaviors to be described, predicted, and managed with previously unattainable accuracy and reliability. Ultimately, this reduces manufacturing time, identifies anomalies, enables early error detection, and predicts downtime. Furthermore, a deeper understanding of sensor behavior is useful not only for minimizing errors during MEMS sensor manufacturing but also for developing new products and product families.
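
A minimal sketch of the kind of statistical screening involved – Python with invented measurements, not Bosch’s actual pipeline: robust z-scores flag dies whose in-line measurements fall far outside the process distribution, a simple precursor to the trainable models described above.

```python
import numpy as np

def robust_zscores(x):
    """Score each measurement against the median and MAD, which stay stable
    even when the data contain the very outliers we want to find."""
    med = np.median(x)
    mad = np.median(np.abs(x - med)) * 1.4826  # scales MAD to sigma for normal data
    return (x - med) / mad

rng = np.random.default_rng(42)
# Hypothetical in-line measurement of a MEMS parameter (e.g. a resonance
# frequency in Hz) for 1000 dies; a handful of dies drift far off-process.
freqs = rng.normal(loc=32_768.0, scale=2.0, size=1000)
freqs[[10, 500, 900]] += 30.0  # injected faulty dies

flagged = np.flatnonzero(np.abs(robust_zscores(freqs)) > 6.0)
print(sorted(flagged.tolist()))  # [10, 500, 900]
```

Real production data are of course multivariate and time-dependent, which is where the deep learning techniques mentioned above take over from simple univariate screening.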

As a further step, artificial intelligence applications can be embedded directly into the MEMS sensors to describe and predict their behavior or enhance them with new functionalities to increase user satisfaction. An example of such an application is the detection of mechanical overloads, which is important as overloading can reduce the accuracy of MEMS sensors in the target devices or applications. Analyzing the behavior of numerous sensors is necessary to understand such effects, which would be challenging without automation.

Benefits for the business partner:

The application of the aforementioned analytical tools leads to more efficient industrial manufacturing processes and higher-performing product design processes in MEMS manufacturing for Bosch. This gives Bosch a competitive advantage over its rivals by reducing sensor manufacturing time and increasing the accuracy of MEMS-based products. With the resulting price reduction, the exclusive use of MEMS-based sensors becomes more feasible, for example, in autonomous driving, enabling more affordable end products and greater adoption of the technology.

A longer-term but realistic hope is that, thanks to the analytical tools and the deeper knowledge gained, (i) previously unrecognized opportunities for perfecting the manufacturing process will be revealed; (ii) new perspectives will emerge for manufacturing support on the development side; (iii) the possibility of building AI-based intelligent MEMS sensors will be realized; and (iv) a new business segment may be created for Bosch, offering the devices as services to paying customers, primarily other manufacturing companies.

Which parts of the project are/were carried out at the Faculty of Informatics, Eötvös Loránd University (ELTE):

The Faculty of Informatics at ELTE has taken on a series of crucial tasks for the project, including the optimization of supervised and unsupervised deep learning techniques used in the mentioned application areas. On a more general level, ELTE has contributed to the project’s implementation by selecting and customizing appropriate artificial intelligence solutions, developing methods that ensure transferability between different analytical tools, and actively involving multiple ELTE researchers.

Why did Bosch choose the Faculty of Informatics, ELTE, as its partner:

ELTE researchers brought a strong theoretical approach to the project, ensuring solid scientific foundations and high theoretical standards throughout the project. Additionally, the ELTE team provided specialized R&D competencies that even the highly R&D-oriented Bosch did not possess to the extent required by the project. Another significant advantage for Bosch is that the project generates a pool of students who possess practical experience and understanding of Bosch’s products and systems, not only at a conceptual level but also at a practical application level. These students can contribute to the further development of these products and systems without the need for additional training. From the ELTE perspective, the project offers students direct exposure to applicable, valuable, and market-oriented education, enhancing their value in the job market.

Additional noteworthy details:

Itilekha Podder’s personal story is worth sharing as she played a key role in the project’s conceptualization, organization, and implementation. She was involved in the project as both an ELTE doctoral student and a Bosch researcher.

Placing Service Components In A Distributed Cloud-Based Infrastructure

Business/Market Problem:

Today, virtualization – which involves multiplying the resources of physically existing hardware through software-simulated (“virtual”) computing environments – is transforming various industries, including the telecommunications sector. Telecommunication operators such as AT&T and Verizon are continuously replacing physical network components with virtual, purely software-based components. Manufacturers serving these operators, such as Ericsson, also need to adapt to this trend.

The place of dedicated hardware components is gradually being taken over by general-purpose servers, where introducing a new service simply means launching new software programs, replacing the time-consuming design, manufacturing, and deployment of the dedicated physical devices that, until recently, were required for each new service.

At the same time, the most advanced fifth-generation (5G) telecommunications networks are being introduced worldwide. These networks make it possible to support applications that tolerate only an extremely low level of signal transmission latency. Supporting these latency-critical applications, however, requires that the critical parts of the applications be located close to the users. Since telecommunication operators already have infrastructure near users, introducing low-latency cloud services is a natural next step for them. This enables applications such as industrial robot controllers, augmented and virtual reality applications, and cutting-edge digital healthcare services to be migrated to cloud-based infrastructure.

What is still missing before the physical infrastructure near users can support latency-critical applications from the cloud is the transformation of today’s centralized cloud infrastructure into a distributed one. This process is by no means trivial. To launch a new service, an appropriate data center must be selected for each software component individually, based on the service requirements. These requirements can hold between software components (e.g., an acceptable latency level), within individual components (e.g., the hardware support a component needs to operate), or across several components at once. In a distributed system with hundreds or even thousands of data centers, selecting the correct location for each component by hand would be unrealistically time-consuming, not to mention the complexity of the task and the cost of optimization.
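The scale problem described above can be sketched with a toy example. The sketch below is an illustration only (the data center names, latency figures, and the two-component service are invented, not taken from the project): it enumerates every possible placement of two service components across three data centers and keeps only those that satisfy an inter-component latency requirement. Even here the candidate set grows as sites^components, which is why hand-picking locations does not scale to hundreds or thousands of data centers.

```python
from itertools import product

# Hypothetical measured latency (ms) between pairs of data centers.
latency = {
    ("dc1", "dc1"): 0, ("dc1", "dc2"): 12, ("dc1", "dc3"): 40,
    ("dc2", "dc2"): 0, ("dc2", "dc3"): 25, ("dc3", "dc3"): 0,
}

def lat(a, b):
    """Latency lookup, symmetric in its arguments."""
    return latency[(a, b)] if (a, b) in latency else latency[(b, a)]

data_centers = ["dc1", "dc2", "dc3"]
components = ["frontend", "backend"]
max_latency = 15  # requirement *between* the two components

# Exhaustive search over all sites**components placements: fine for a
# toy case, hopeless at the scale of a real distributed cloud.
feasible = [
    placement
    for placement in product(data_centers, repeat=len(components))
    if lat(placement[0], placement[1]) <= max_latency
]
print(feasible)
```

With three sites and a 15 ms bound, five of the nine placements survive; a real deployment also has per-component hardware constraints, which prunes the set further but does not tame its growth.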

Solution:

The goal is to create a framework that can automatically allocate resources to individual service components in a distributed cloud infrastructure. The first step is to gather information about the infrastructure’s structure and status. Since this information is stored in various specialized databases (also referred to as inventories), the framework needs to be able to connect to these inventories and extract the necessary data from them.
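The inventory-gathering step above can be pictured as a set of adapters behind a common interface. The sketch below is a minimal illustration of that idea, with assumed class and field names (the project's actual inventory APIs are not described in this document): each specialized database is wrapped in a connector, and the framework merges the per-inventory records into a single view of the infrastructure.

```python
class InventoryAdapter:
    """Common interface every inventory connector implements."""
    def fetch_nodes(self):
        raise NotImplementedError

class ComputeInventory(InventoryAdapter):
    def fetch_nodes(self):
        # A real adapter would query a compute-resource database here.
        return [{"id": "dc1", "cpu_free": 64}, {"id": "dc2", "cpu_free": 8}]

class NetworkInventory(InventoryAdapter):
    def fetch_nodes(self):
        # A real adapter would query a network-topology database here.
        return [{"id": "dc1", "region": "eu-central"},
                {"id": "dc2", "region": "eu-west"}]

def build_view(adapters):
    """Merge per-inventory records into one record per data center."""
    view = {}
    for adapter in adapters:
        for record in adapter.fetch_nodes():
            view.setdefault(record["id"], {}).update(record)
    return view

view = build_view([ComputeInventory(), NetworkInventory()])
print(view["dc1"])  # one merged record combining both inventories
```

The adapter pattern keeps the placement logic independent of how each operator happens to store its infrastructure data.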

Before the system goes into operation, however, the services to be launched later must be designed. An expert does this by creating a template for each planned service, describing the components needed to launch it and the relationships between them. When the operator, or one of their customers directly, wants to launch a service, they select a template and complement it with requirements specific to the case at hand, such as the acceptable latency from a particular region or the resource needs of the initial launch. In the developed framework, these requirements can be formulated without any knowledge of the infrastructure. From the template supplemented with the specific requirements, the framework allocates resources to each component of the service and plans the data flow between the components.
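The template-plus-requirements workflow can be sketched roughly as follows. All names, components, and numbers in this sketch are invented for illustration; the point is only the separation the text describes: the template fixes the components and their relationships, while latency bounds and other case-specific requirements are filled in per launch without touching the template.

```python
import copy

# Hypothetical service template: components, their intra-component
# requirements, and the links between them.
template = {
    "name": "video-analytics",
    "components": {
        "camera-gw": {"cpu": 2},
        "analyzer":  {"cpu": 8, "needs_gpu": True},  # intra-component requirement
    },
    "links": [
        # inter-component requirement, left open until launch time
        {"from": "camera-gw", "to": "analyzer", "max_latency_ms": None},
    ],
}

def instantiate(template, overrides):
    """Fill in case-specific requirements without modifying the template."""
    service = copy.deepcopy(template)
    for link in service["links"]:
        key = (link["from"], link["to"])
        if key in overrides.get("latency", {}):
            link["max_latency_ms"] = overrides["latency"][key]
    service["region"] = overrides.get("region")
    return service

service = instantiate(template, {
    "latency": {("camera-gw", "analyzer"): 10},  # acceptable latency this launch
    "region": "eu-central",
})
print(service["links"][0]["max_latency_ms"])  # 10
```

Because the overrides refer only to component names from the template, the customer can state requirements without knowing anything about the underlying infrastructure, exactly as the framework intends.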

The framework also allows for the selection of the appropriate resource distribution based on operator-defined criteria when multiple alternative solutions exist. In practice, this means that the operator can define a ranking principle that prioritizes specific criteria, such as cost reduction or performance maximization.
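An operator-defined ranking principle of the kind mentioned above can be expressed very simply as a key function over the alternative solutions. The candidate placements and their cost and throughput figures below are invented for illustration; the sketch only shows how one ranking prioritizes cost reduction and another performance maximization over the same set of feasible alternatives.

```python
# Hypothetical feasible placements with their estimated properties.
candidates = [
    {"placement": ("dc1", "dc1"), "cost": 9.0, "throughput": 500},
    {"placement": ("dc1", "dc2"), "cost": 6.0, "throughput": 400},
    {"placement": ("dc3", "dc3"), "cost": 4.0, "throughput": 250},
]

# The operator's ranking principle is a key function over candidates.
def minimize_cost(c):
    return c["cost"]

def maximize_performance(c):
    return -c["throughput"]  # negate so that min() picks the fastest

cheapest = min(candidates, key=minimize_cost)
fastest = min(candidates, key=maximize_performance)
print(cheapest["placement"])  # ('dc3', 'dc3')
print(fastest["placement"])   # ('dc1', 'dc1')
```

Swapping the key function changes which alternative wins without touching the placement search itself, which matches the framework's separation between finding feasible solutions and ranking them.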

Advantages for the business partner provided by the solution:

The demand for supporting latency-critical services naturally leads to the gradual replacement of centralized cloud infrastructures with distributed ones. The framework described here allows Ericsson to launch services composed of interconnected components on distributed cloud infrastructure in an automated manner, taking various requirements (e.g., latency, resource needs) into account. Although the solution is based on a heuristic algorithm that does not guarantee finding the optimal solution in every case, the undeniable value of the framework lies in its ability to solve the task automatically and in real time (within minutes or even seconds).

What part/slice of the task does ELTE Faculty of Informatics solve:

One of the ELTE PhD students has been an integral part of the team developing the framework for the past six years. Building on several years of development experience, the team laid new foundations for the system in 2018 and has been continuously expanding it with new functionalities. The ELTE PhD student has played a significant role in designing and implementing the new system, as well as in adding the expanding set of features.

An additional benefit of the project is the discovery that the framework can serve as a foundation for solving other problems that are not directly related to the original objective but require a similar approach. The ELTE student is involved in evaluating these additional business problems; when the framework proves relevant to a specific problem, they help identify and develop the additions still missing from the system so that it becomes applicable in the new context.

The collaboration with the ELTE Faculty of Informatics brings research expertise and capability to the project: the PhD student’s involvement ensures that the framework benefits from current academic insights and methodologies, contributing to its effectiveness and innovation.

By partnering with ELTE, Ericsson also gains access to a pool of talented researchers and professionals who can contribute to the ongoing development and improvement of the framework, while the collaboration creates opportunities for knowledge exchange and mutual learning between academia and industry. Overall, it plays a crucial role in advancing the framework’s capabilities, enabling the automated deployment of services on distributed cloud infrastructure with consideration for specific requirements and resource allocation.
