This book gathers papers on the digitalization of society, economics and management in the post-pandemic period. It shares the latest insights into various aspects of the digitalization of the economy and the consequences of this transformation in public administration, business and public life. Integrating a broad range of analytical perspectives, including economic, social and technological, this interdisciplinary book is particularly relevant for scientists, digital technology users, companies and public institutions.
Strategic transformation and logistics integration in supply chain management require systematic strategic supply chain modeling, and modern simulation provides such opportunities for the analysis and synthesis of efficient, integrated supply chains. The authors suggest a method for constructing and analyzing conceptual supply chain models. The following base levels of supply chain representation are considered: the object level, the configuration/network level, the process level, and the logistics coordination level. A general simulation model of an integrative supply chain is proposed, based on hybrid process- and agent-based simulation modeling. A literature review on the application of simulation modeling to integrative and collaborative supply chains is presented. Iterative simulation and optimization procedures for the complex analysis and optimization of supply chains are proposed. The suggested approaches and techniques were tested in a case of strategic supply chain transformation. The authors present and interpret the results of supply chain optimization and simulation modeling for a set of scenarios covering the transformation of logistics processes and inventory management policies, interorganizational coordination mechanisms and related technological solutions.
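As a hedged illustration of the process-and-agent idea, and not the authors' model, the following minimal Python sketch simulates a two-echelon supply chain in discrete time: each node is an agent applying an order-up-to (base-stock) inventory policy, and replenishment is a process with a lead-time delay. All parameters (demand range, lead times, base-stock levels) are hypothetical, and upstream shortages are not propagated, to keep the sketch short.

import random

class Node:
    # Toy supply chain agent with an order-up-to (base-stock) policy.
    def __init__(self, name, base_stock, lead_time):
        self.name = name
        self.inventory = base_stock
        self.base_stock = base_stock
        self.lead_time = lead_time
        self.pipeline = []          # (arrival_period, quantity) of orders in transit
        self.backlog = 0

    def receive_deliveries(self, period):
        arrived = sum(q for (t, q) in self.pipeline if t == period)
        self.pipeline = [(t, q) for (t, q) in self.pipeline if t != period]
        self.inventory += arrived

    def satisfy_demand(self, demand):
        shipped = min(self.inventory, demand + self.backlog)
        self.inventory -= shipped
        self.backlog = demand + self.backlog - shipped
        return shipped

    def reorder(self, period):
        in_transit = sum(q for (_, q) in self.pipeline)
        position = self.inventory + in_transit - self.backlog
        order = max(0, self.base_stock - position)
        if order:
            self.pipeline.append((period + self.lead_time, order))
        return order

# Two-echelon chain: a retailer served by a distributor (hypothetical parameters).
# Simplification: the retailer's order always enters its pipeline in full.
retailer = Node("retailer", base_stock=40, lead_time=2)
distributor = Node("distributor", base_stock=80, lead_time=4)
random.seed(1)

for period in range(1, 21):
    for node in (retailer, distributor):
        node.receive_deliveries(period)
    customer_demand = random.randint(5, 15)
    retailer.satisfy_demand(customer_demand)
    retailer_order = retailer.reorder(period)
    distributor.satisfy_demand(retailer_order)
    distributor.reorder(period)
    print(period, customer_demand, retailer.inventory, retailer.backlog)

Replacing the inventory policy, lead times or coordination rule in such a model is how alternative transformation scenarios can be compared before committing to one.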
A properly built risk assessment process can help to significantly reduce the overall level of project uncertainty, which in turn has a positive impact on the project outcome. Based on the recommendations given in the BABOK® Guide, a combined procedure for risk analysis is built up, which allows risk assessment to be performed within the framework of the overall risk management process. The main groups of risks, classified by their origin, are identified for a reference IT project in the financial sphere. This orients the proposed procedure toward detailed risk analysis, in which not only the qualitative but also the quantitative measure of the risks, i.e., their probability and the gravity of their consequences, can be assessed. The risk assessment process is demonstrated for a project consisting in creating and deploying a modern corporate data warehouse in a large Russian private bank. The procedure is extended by a decision tree that concisely illustrates the risk decomposition, which opens the way to a highly predictable further risk management process. The short feedback replies given by the main stakeholders at the end of the first stage are also presented.
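As a hedged sketch of the quantitative side of such an assessment, and not of the procedure from the BABOK® Guide or of the project itself, the Python snippet below combines probability and impact into an expected monetary value for a few hypothetical risk groups and ranks them; this is the kind of decomposition a decision tree makes explicit.

# Hypothetical risk register for illustration only: probability in [0, 1],
# impact as an assumed monetary loss if the risk materializes.
risks = [
    {"group": "data migration defects",   "probability": 0.30, "impact": 200_000},
    {"group": "vendor delivery delay",    "probability": 0.15, "impact": 500_000},
    {"group": "key staff unavailability", "probability": 0.10, "impact": 150_000},
]

for r in risks:
    # Expected monetary value (EMV): probability times consequence.
    r["emv"] = r["probability"] * r["impact"]

# Rank risks by EMV, highest first, to prioritize responses.
for r in sorted(risks, key=lambda r: r["emv"], reverse=True):
    print(f"{r['group']:28s} EMV = {r['emv']:>10,.0f}")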
With technology evolving rapidly and proliferating, it is imperative to pay attention to the security of mobile devices, which are currently responsible for processing various kinds of sensitive data. This phase is essential as an intermediate step before delivery to cloud or distributed ledger storage and should be given additional care due to its inevitability. This paper analyzes the security mechanisms applied for internal use in the Android OS and for communication between the Android OS and a remote server. The presented work aims to examine these mechanisms and evaluate which cryptographic methods and procedures are most advantageous in terms of energy efficiency, derived from execution time. The dataset with measurements collected from 17 mobile devices and the code for reproducibility are also provided. After analyzing the collected data, specific cryptographic algorithms are recommended for implementing an application that utilizes native cryptographic operations on modern Android devices. In particular, the selected algorithm for symmetric encryption is AES256 / GCM / No Padding; for digital signatures, SHA512 with RSA2048 / PSS; and for asymmetric encryption, RSA3072 / OAEP with SHA512 and MGF1 padding.
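The recommended primitives can be exercised outside Android as well; as a minimal sketch using Python's cryptography package rather than the Android Keystore (so only the algorithm choices, not the platform APIs, mirror the recommendation), the snippet below performs AES-256-GCM encryption, an RSA-2048 PSS signature with SHA-512, and RSA-3072 OAEP encryption with SHA-512 and MGF1.

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

message = b"sensitive payload"

# Symmetric encryption: AES-256 in GCM mode (no padding is needed for GCM).
aes_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(aes_key).encrypt(nonce, message, None)

# Digital signature: RSA-2048 with PSS padding and SHA-512.
sign_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
signature = sign_key.sign(
    message,
    padding.PSS(mgf=padding.MGF1(hashes.SHA512()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA512(),
)

# Asymmetric encryption: RSA-3072 with OAEP, SHA-512 and MGF1 (here wrapping the AES key).
enc_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
wrapped = enc_key.public_key().encrypt(
    aes_key,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA512()),
                 algorithm=hashes.SHA512(), label=None),
)

print(len(ciphertext), len(signature), len(wrapped))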
Albert Abubakirovich Galeev is a Soviet and Russian expert in plasma physics who actively contributed to fusion research. In the early 1970s, he became head of a department at the Space Research Institute of the USSR Academy of Sciences and began devoting most of his time to the problems of the physics of space plasma, making very important contributions to the solution of many of them, such as the physics of collisionless shock waves, the phenomenon of anomalous ionization, processes in the plasma envelopes of comets, and many others. This paper is devoted to only one of the many directions of his work: studies of current sheets and the magnetic reconnection processes that occur therein. Studies of thin current structures in space plasma, whose thickness is about the proton gyroradius, began with the pioneering works of S.I. Syrovatskii, T. Speiser, and other outstanding scientists, who proposed that in space plasma thin boundary current sheets exist, which play the key role in the dynamics of Earth’s magnetosphere and the Sun’s corona. The development of these works was dictated by the necessity to explain solar flares and magnetospheric perturbations, during which phases of evolutionary development are replaced by explosive spontaneous processes that release free energy. One of the key physical processes is magnetic field reconnection, which is realized in nature as a part of the general problem of generation and evolution of current sheets. In a series of works that started in 1975 with the publication (together with L.M. Zelenyi) of the article entitled “Metastable states of a diffuse neutral layer” in JETP Letters, A.A. Galeev studied the stability of current sheets with respect to the tearing mode and the dynamics of magnetic reconnection at the boundary of planetary magnetospheres, and explained the processes of generation of fast ion flows with energies of several MeV in Earth’s magnetotail. In this paper, we discuss the further development of these works that were once initiated by A.A. Galeev. A new model of embedded current sheets is presented, which consists of an internal electron sheet and two external current sheets formed by proton and oxygen ion currents. It is shown that the free energy of such an embedded structure in the corresponding range of parameters substantially exceeds the free energy of the well-known Harris configuration. This allows one to simultaneously explain their stability (up to a certain limit) and their destabilization when the current sheet parameters reach certain critical values, which leads to a change of the topology of the magnetic field and the onset of magnetic reconnection.
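For reference, the Harris configuration mentioned above has the standard single-scale profile (quoted here from the general literature, not from the works discussed; Gaussian units):

B_x(z) = B_0 \tanh\!\left(\frac{z}{L}\right), \qquad
j_y(z) = \frac{c\,B_0}{4\pi L}\,\operatorname{sech}^2\!\left(\frac{z}{L}\right), \qquad
n(z) = n_0 \operatorname{sech}^2\!\left(\frac{z}{L}\right),

where B_0 is the asymptotic field, L the sheet half-thickness and n_0 the peak plasma density. It is against the free energy of this single-scale configuration that the embedded, multi-scale structure is compared.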
We analyze how artificial intelligence changes a significant part of the energy sector, the oil and gas industry. We focus on the upstream segment as the most capital-intensive part of oil and gas and the segment with enormous uncertainties to tackle. Based on the analysis of AI application possibilities and a review of existing applications, we outline the most recent trends in developing AI-based tools and identify their effects on accelerating and de-risking processes in the industry. We investigate AI approaches and algorithms, as well as the role and availability of data in the segment. Further, we discuss the main non-technical challenges that prevent the intensive application of artificial intelligence in the oil and gas industry, related to data, people, and new forms of collaboration. We also outline three possible scenarios of how artificial intelligence will develop in the oil and gas industry and how it may change it in the future (in 5, 10, and 20 years).
Currently, wireless body area networks (WBANs) are developing rapidly. Modern technology makes it possible to develop miniature wearable devices with light weight and low power consumption. However, power consumption remains a key research challenge for WBANs. This paper provides a review of two intrabody communication (IBC) methods. Equivalent body channel models and modeling methods are analyzed. The conventional IBC WBAN architecture is reviewed, and a novel energy-efficient architecture and an IBC WBAN data transmission technique are proposed. An experiment was conducted to measure the output power of the BodyCom mobile module. The comparative analysis showed that the power consumption of the proposed IBC technique is 7 times lower than Bluetooth LE and 14 times lower than ZigBee. The proposed technique can be applied in areas such as home and industrial automation, medical appliances, wearable devices, security, and the Internet of Things.
A trapdoor cipher is a cipher whose algorithm contains some hidden structure (a trapdoor) providing the existence of a subliminal information channel. In cryptographic practice, there can be situations when a constructed cipher contains some critical defect (a trapdoor) whose identification can significantly weaken the cryptographic strength of this cipher. In this paper, we propose and analyze one such defect in terms of an automata-theoretic approach. The operation of a cipher with this defect is modeled by a finite automaton under so-called effective observation. The existence of an effective observation for a finite automaton qualitatively reflects the presence of a trapdoor which allows one to determine information on automaton input words by observing the corresponding output words. We prove a criterion for identifying an automaton under effective observation and specify the classes of automata under effective observation as well as the classes of automata for which no effective observation exists. Possible applications of the results to protecting ciphers from side-channel attacks are formulated.
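As a toy, hedged illustration of the idea (not the authors' formal criterion), the following Python sketch models a Mealy automaton whose output function is, in every state, injective with respect to the input symbol; an observer who knows the machine and its initial state can therefore reconstruct the whole input word from the output word, which is exactly the kind of leakage an effective observation captures.

# Toy Mealy automaton over the alphabet {a, b}: transition function delta and
# output function lam given as dicts. In every state, distinct inputs yield
# distinct outputs, so the output word leaks the input word to an observer.
delta = {("s0", "a"): "s1", ("s0", "b"): "s0",
         ("s1", "a"): "s0", ("s1", "b"): "s1"}
lam = {("s0", "a"): 0, ("s0", "b"): 1,
       ("s1", "a"): 1, ("s1", "b"): 0}

def run(word, state="s0"):
    # Legitimate operation: produce the output word for an input word.
    out = []
    for x in word:
        out.append(lam[(state, x)])
        state = delta[(state, x)]
    return out

def reconstruct(output, state="s0"):
    # Observer's attack: invert the per-state injective output function.
    word = []
    for y in output:
        inverse = {lam[(state, x)]: x for x in "ab"}
        x = inverse[y]
        word.append(x)
        state = delta[(state, x)]
    return "".join(word)

secret = "abbab"
assert reconstruct(run(secret)) == secret  # the input word is fully recovered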
The goal of the paper is to develop a new algorithm for predicting whether a company will go bankrupt on the basis of unbalanced data. To this end, we propose to consider the classification as a multi-objective optimization problem and to construct a prediction model as an ensemble while minimizing the FPR (False Positive Rate) and FNR (False Negative Rate) at the same time. To create the ensemble, the proposed Multi-Objective Classifier Selection (MOCS) algorithm selects only classifiers that belong to the Pareto-optimal set in FPR/FNR space; that is, there is no dominance between them, and they satisfy some additional conditions. In the general case, MOCS is determined by three parameters: two threshold values that limit the false rates (FNR and FPR), and the crowding distance, which defines the uniqueness of a classifier's results. We tested the proposed algorithm on data collected from 2457 Russian companies, 456 of which went bankrupt, and 5910 Polish companies, 410 of which received bankruptcy status. The datasets contain features such as financial ratios and business environment factors. In the testing, we used more than 70 combinations of under-sampling, over-sampling, and no-sampling methods with static and dynamic classification models. The final ensembles include seven classifiers for the Russian dataset and four classifiers for the Polish dataset, combined by a soft voting rule. In both cases, the proposed algorithm produces a significant improvement of prediction results, both in terms of standard metrics (geometric mean, area under the ROC curve) and in the visual representation in FNR/FPR space, namely in the shift of the Pareto-optimal set of classifiers.
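As a hedged sketch of the selection step only (the full MOCS algorithm also applies threshold and crowding-distance conditions that are not reproduced here), the following Python function keeps the classifiers whose (FPR, FNR) points are not dominated by any other classifier, i.e. the Pareto-optimal candidates for the ensemble; the listed classifiers and error rates are hypothetical.

def pareto_front(candidates):
    # Keep classifiers not dominated in (FPR, FNR): no other classifier is
    # at least as good on both rates and strictly better on at least one.
    front = []
    for name, fpr, fnr in candidates:
        dominated = any(
            (f2 <= fpr and n2 <= fnr) and (f2 < fpr or n2 < fnr)
            for _, f2, n2 in candidates
        )
        if not dominated:
            front.append((name, fpr, fnr))
    return front

# Hypothetical classifiers with their false positive / false negative rates.
candidates = [
    ("logreg",        0.20, 0.35),
    ("random_forest", 0.15, 0.30),   # dominates logreg
    ("svm",           0.10, 0.40),
    ("boosting",      0.25, 0.20),
    ("knn",           0.30, 0.45),   # dominated
]

print(pareto_front(candidates))
# random_forest, svm and boosting remain; logreg and knn are dominated.

The surviving classifiers would then be filtered by the FPR/FNR thresholds and the crowding-distance condition before being combined by soft voting.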
The growing interest in and expectations from blockchain applications attract many analysts to this issue. In what spheres of logistics and supply chain management is blockchain appropriate? What blockchain software solutions are available to companies now? This paper investigates the basic functionality of the existing software solutions on the market; a comparative analysis of the blockchain platforms used for developing solutions for logistics is also carried out. The main trends of blockchain applications in logistics and supply chain management are identified, based on the analysis of project experience with the use of blockchain in different countries. The problems, limitations and conditions of blockchain implementation are also determined.
Extract-transform-load (ETL) processes play a crucial role in data analysis in real-time data warehouse environments, which demand low-latency and high-availability features for functionality. ETL processes are becoming bottlenecks in such environments due to growing complexity, the number of steps in data transformations, the number of machines used for data processing and, finally, the increasing impact of human factors on the development of new ETL processes. In order to mitigate this impact and provide resilience of the ETL process, a special Metadata Framework is needed that can manage the design of new data pipelines and processes. In this work, we focus on ETL metadata and its use in driving process execution and present a proprietary approach to the design of metadata-based process control that can reduce complexity, enhance the resilience of ETL processes and allow their adaptive self-reorganization. We present a metadata framework implementation based on open-source Big Data technologies, describing its architecture and interconnections with external systems, data model, functions, quality metrics, and templates. A test execution of an experimental Airflow Directed Acyclic Graph (DAG) with randomly selected data is performed to evaluate the proposed framework.
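The flavour of metadata-driven execution can be hedged into a minimal sketch against the Airflow 2 API (a generic illustration, not the framework described in the paper): a pipeline definition stored as plain metadata is turned into a DAG whose tasks and dependencies are generated at parse time. The step names, the callable, and the schedule are hypothetical.

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical pipeline metadata: step name, upstream steps, and parameters.
PIPELINE_METADATA = [
    {"name": "extract_orders", "upstream": [],                 "params": {"source": "orders"}},
    {"name": "clean_orders",   "upstream": ["extract_orders"], "params": {"rules": "default"}},
    {"name": "load_warehouse", "upstream": ["clean_orders"],   "params": {"target": "dwh"}},
]

def run_step(step_name, **params):
    # Placeholder for the real transformation logic driven by metadata.
    print(f"running {step_name} with {params}")

with DAG(
    dag_id="metadata_driven_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    tasks = {}
    for step in PIPELINE_METADATA:
        tasks[step["name"]] = PythonOperator(
            task_id=step["name"],
            python_callable=run_step,
            op_kwargs={"step_name": step["name"], **step["params"]},
        )
    # Wire dependencies exactly as declared in the metadata.
    for step in PIPELINE_METADATA:
        for upstream in step["upstream"]:
            tasks[upstream] >> tasks[step["name"]]

Changing the metadata, rather than the DAG code, is what allows new pipelines to be added or reorganized without touching the orchestration layer.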
Our interest here lies in supporting important, but routine and time-consuming activities that underpin success in highly distributed, collaborative design and manufacturing environments; and how information structuring can facilitate this. To that end, we present a simple, yet powerful approach to team formation, partner selection, scheduling and communication that employs a different approach to the task of matching candidates to opportunities or partners to requirements (matchmaking): traditionally, this is approached using either an idea of ‘nearness’ or ‘best fit’ (metric-based paradigms); or by finding a subtree within a tree (data structure) (tree traversal). Instead, we prefer concept lattices to establish notions of ‘inclusion’ or ‘membership’: essentially, a topological paradigm. While our approach is substantive, it can be used alongside traditional approaches and in this way one could harness the strengths of multiple paradigms.
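A minimal, hedged Python sketch of the membership view of matchmaking (far simpler than a full concept lattice, but showing the inclusion test it rests on): each candidate is described by a set of attributes, a requirement is another attribute set, and a candidate matches when the requirement is included in its description; the two derivation operators of formal concept analysis are shown for a single attribute set. All names and attributes are hypothetical.

# Hypothetical candidate descriptions (a formal context: objects x attributes).
context = {
    "supplier_A": {"milling", "ISO9001", "EU"},
    "supplier_B": {"milling", "casting", "EU"},
    "supplier_C": {"casting", "ISO9001", "Asia"},
}

def matches(requirement):
    # Membership/inclusion test: every required attribute must be present.
    return [obj for obj, attrs in context.items() if requirement <= attrs]

def extent(attributes):
    # FCA derivation: the objects having all the given attributes.
    return {obj for obj, attrs in context.items() if attributes <= attrs}

def intent(objects):
    # FCA derivation: the attributes shared by all the given objects.
    shared = None
    for obj in objects:
        shared = context[obj] if shared is None else shared & context[obj]
    return shared or set()

requirement = {"milling", "EU"}
team = matches(requirement)                # ['supplier_A', 'supplier_B']
print(team, intent(extent(requirement)))   # closed intent: {'milling', 'EU'}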
As an exogenous antecedent of national innovation performance, culture has been receiving significant attention in cross-cultural research. However, relying primarily on Hofstede’s framework of national culture, this research has so far been predominantly inclined to treat culture as a collection of independent dimensions, thereby ignoring the complex notion of culture profiles, which refer to distinctive patterns of interrelated dimensions that cannot be considered in isolation, but only in combination. Employing the lens of neo-configurational theory and with the support of fuzzy-set Qualitative Comparative Analysis (fsQCA), the present study aims to fill this gap by exploring how multiple Hofstede dimensions interact and combine to influence national innovation performance. In this way, this study goes beyond the existing theory and empirical evidence about the relationship between distinctive culture profiles and innovation performance at the national level, while broadening our understanding more generally of how to conceptualize and operationalize culture in business research.
Recently, transfer learning from pre-trained language models has proven to be effective in a variety of natural language processing tasks, including sentiment analysis. This paper aims at identifying deep transfer learning baselines for sentiment analysis in Russian. Firstly, we identified the most used publicly available sentiment analysis datasets in Russian and recent language models which officially support the Russian language. Secondly, we fine-tuned Multilingual Bidirectional Encoder Representations from Transformers (BERT), RuBERT, and two versions of the Multilingual Universal Sentence Encoder and obtained strong, or even new, state-of-the-art results on seven sentiment datasets in Russian: SentiRuEval-2016, SentiRuEval-2015, RuTweetCorp, RuSentiment, LINIS Crowd, Kaggle Russian News Dataset, and RuReviews. Lastly, we made the fine-tuned models publicly available for the research community.
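A minimal, hedged sketch of such fine-tuning with the Hugging Face Transformers library follows; the checkpoint name, label count, tiny in-memory dataset and training settings are assumptions for illustration, not the exact configuration used in the paper.

from datasets import Dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

# Assumed checkpoint: the publicly available RuBERT released by DeepPavlov.
checkpoint = "DeepPavlov/rubert-base-cased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=3)

# Tiny hypothetical dataset standing in for one of the Russian benchmarks
# (labels: 0 negative, 1 neutral, 2 positive).
data = Dataset.from_dict({
    "text": ["Отличный сервис", "Ужасное обслуживание", "Нормально"],
    "label": [2, 0, 1],
})
data = data.map(lambda b: tokenizer(b["text"], truncation=True,
                                    padding="max_length", max_length=64),
                batched=True)

args = TrainingArguments(output_dir="rubert-sentiment",
                         num_train_epochs=1,
                         per_device_train_batch_size=2,
                         learning_rate=2e-5,
                         logging_steps=1)

Trainer(model=model, args=args, train_dataset=data).train()

In practice the same loop is applied to each benchmark with a held-out evaluation split; only the checkpoint and the number of labels change between datasets.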
One of the most promising enablers for the secure distributed operation of Internet of Things (IoT) systems could be based on a mathematical construct widely known as blockchain, which aims to address the system’s centralization and scalability concerns. This paper aims to map the requirements and features of both systems, highlight the main co-existence challenges and the technological candidates for smoother integration of IoT and blockchain, as well as provide a standardization outlook. Moreover, an architectural approach to an integrated solution is identified based on a classic literature review methodology, aiming to consider the mapping of IoT versus blockchain characteristics and to outline the related challenges. Critical solutions to address the integration bottlenecks are highlighted, including moving from Proof-of-Work (PoW) to Distributed Proof-of-Stake (DPoS) consensus, adding a Fog overlay to the architecture model, and leveraging the synergies that combine the benefits of blockchain and IoT technology.
The architectural approach makes it possible to reuse knowledge in the development of a digital platform through the use of modelling patterns, which is especially important given the spread of digital platforms and enterprises’ need to obtain the competitive advantages that digital platforms provide to their owners. This article deals with the formation of multi-level conceptual models that consider the subject area of the organization and are further used as a conceptual basis for designing a digital platform. The possibility of using three levels of modelling abstraction, including the meta-metamodel, the metamodel, and the enterprise architecture model of a digital platform itself, is considered in detail. Practical experience of applying modelling patterns is demonstrated for the development of a digital platform of a large service company that provides services to transport companies, suppliers of sensors and equipment, as well as delivery customers.
At present, many manufacturing companies face the problem of setting their production plans taking into account existing limiting factors, which can be of both internal and external nature. This task is known as determining the product mix: the scope and volumes of products that should be manufactured and sold. Solving this task is a decision-making process: it is necessary to select an optimal (with respect to one or several criteria) combination of products and their production volumes. The paper focuses on two approaches to the product mix task. One approach is based on management accounting calculations: the product mix is selected to maximize profitability by ranking the products according to their contribution-earning ability per unit of the limiting factor. Certain limitations of the management accounting approach are the problems with forecasted assumptions, the use of only financial considerations, ignoring the degree of preference of one product in comparison with another, as well as missing information about possible states of the external environment. Another approach relies on multi-criteria decision-making methods and expert estimates. This approach allows decision makers to take into account non-financial factors, including qualitative information, as well as to consider the power of distinction between products and environmental aspects. The paper argues that a combination of the two approaches may be used within the common task of determining the product mix. The combined approach, supported by appropriate information systems, makes the decision-making process more efficient and justifiable.
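A hedged worked example of the management accounting side, with entirely hypothetical figures, illustrates only the ranking by contribution per unit of the limiting factor (here assumed to be machine time):

# Hypothetical products: contribution margin per unit, machine hours per unit,
# and a demand cap; machine time is the assumed limiting factor.
products = [
    {"name": "P1", "margin": 120, "hours": 4, "max_demand": 300},
    {"name": "P2", "margin": 90,  "hours": 2, "max_demand": 500},
    {"name": "P3", "margin": 60,  "hours": 1, "max_demand": 400},
]
capacity_hours = 2_000

# Rank by contribution per machine hour, then allocate the scarce hours greedily.
plan, profit = [], 0
for p in sorted(products, key=lambda p: p["margin"] / p["hours"], reverse=True):
    units = min(p["max_demand"], capacity_hours // p["hours"])
    capacity_hours -= units * p["hours"]
    profit += units * p["margin"]
    plan.append((p["name"], units))

print(plan, profit)
# P3 (60/hour) and P2 (45/hour) outrank P1 (30/hour), so the scarce hours go to them first.

The multi-criteria approach would then adjust or override such a ranking with non-financial criteria and expert judgements about preference and the external environment.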
In the modern, rapidly changing business environment, the role of project management in companies' activities is continuously increasing. Since 2014, methodical recommendations on the introduction of project management in executive authorities have been developed. It has been established that mistakes in project implementation lead to multimillion losses, and this does not remain unaddressed.
When performing any work item of a project, a mistake can occur, that is, a deviation from the established requirements. Verification that the final product complies with the established requirements is an obligatory stage of any project. If mistakes in work items are revealed only at this stage, a lot of time is required to localize them, where localization means detecting the work items performed incorrectly. In practice, such a situation almost inevitably leads to missing the project completion deadline and to corresponding material losses. Therefore, it is necessary to check the correctness of work items during project implementation. If the compliance of the result with all established requirements is checked for every work item and the errors are corrected, then the compliance of the final product of the project with all established requirements is guaranteed, but the duration and cost of the project can grow to unacceptable values. The problem of choosing control points is therefore relevant, since its solution makes it possible to reduce losses caused by a delay in project completion due to mistakes in work items. These losses are directly bound to the time needed to restore the regular course of the project, and for the VRP this dependence is non-linear, so the losses grow quickly; in certain cases, the dependence is exponential.
The rational choice of control points is an effective method for reducing the time needed to restore the regular course of the project when mistakes in work items appear.
Several methods for arranging control points are offered. They are based on the principle of information maximization. A diagnostic model of the project and a mathematical model of the emergence of mistakes in project work items are developed for using these methods. Criteria for comparing options of control point arrangement are developed, as well as algorithms and software for the experimental check of the effectiveness of the offered methods and for their practical application to the arrangement of control points in real projects. An experiment aimed at assessing the effectiveness of each of the offered methods was carried out. The results of the experiment showed the expediency of using all the methods as parts of a complex and the high performance of this complex.
Lecture Notes in Information Systems and Organization—LNISO—is a series of
scientific books that explore the current scenario of information systems, in
particular IS and organization. The focus on the relationship between IT, IS and
organization is the common thread of this collection, which aspires to provide
scholars across the world with a point of reference and comparison in the study and
research of information systems and organization. LNISO is the publication forum
for the community of scholars investigating behavioral and design aspects of IS and
organization. The series offers an integrated publication platform for high-quality
conferences, symposia and workshops in this field. Materials are published upon a
strictly controlled double-blind peer review evaluation made by selected reviewers.
LNISO is abstracted/indexed in Scopus.
More information about this series at http://www.springer.com/series/11237
Acceleration of single- and multi-charged oxygen ions in the perturbed Earth's magnetotail is investigated as a possible source of energetic heavy ions in the ring current. A numerical model is developed that allows evaluating the acceleration of oxygen ions O+ to O8+ in two possible scenarios of characteristic perturbations: (A) passage of multiple dipolarization fronts in the magnetotail; (B) passage of fronts followed by electromagnetic turbulence. It is shown that the acceleration processes depend on particle charges as well as on the characteristic time scales of induced electric field variations. The maximum energies gained by oxygen ions correlate with the values of their charges. Our simulations show that all kinds of singly and multiply charged heavy particles can be efficiently accelerated during multiple dipolarization processes of type (A) from initial energies of 12 keV to maximum energies of about several MeV. The energy gain of heavy ions under scenario (B) of magnetospheric perturbations is about 10% higher than in scenario (A). The shapes of the energy spectra obtained in the model were shown to be in agreement with experimental spectra in the range of L-shells corresponding to the ring current and radiation belts. Therefore we conclude that the Earth's magnetotail can play the role of a depot where oxygen ions of both ionospheric and solar wind origin can be effectively accelerated during magnetic substorms to energies of about several MeV and then populate the ring current and radiation belts of the Earth.
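The stated correlation between maximum energy and charge is consistent with the elementary scaling for acceleration by the induced electric field (a standard estimate, not a formula quoted from the paper): the rate of energy gain of a particle of charge q is

\frac{dW}{dt} = q\,\mathbf{v}\cdot\mathbf{E}, \qquad
\mathbf{E} = -\frac{1}{c}\,\frac{\partial \mathbf{A}}{\partial t},

so, for a given induced field and interaction time, the energy gained scales roughly linearly with the ion charge, in line with the higher maximum energies found for the higher oxygen charge states.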