Robert W. Harrison is an emeritus professor of computer science at Georgia State University. His research background is in scientific computing and bioinformatics. He has published 256 peer-reviewed papers and proceedings in areas ranging from molecular modelling and simulation to inverse problems and machine learning. His first employment after receiving his PhD in Molecular Biophysics from Yale University in 1985 was as a programmer for Intelsat, working on convolutional codes and the simulation of satellite communication channels, which demonstrates the breadth of his interests and skills. Since then he has worked at NBS (now NIST), the Frederick Cancer Research and Development Center (NCI), Thomas Jefferson University, and, for the last 23 years, Georgia State University. His most recent research studies imprecise or “fuzzy” machine learning and efficient implementations using restricted Boltzmann machines. His highly efficient algorithms for these machines are two to three thousand times faster than conventional algorithms on test data sets. He has developed algorithms for continuous Boltzmann machines that exhibit quadratic convergence in the worst case. Implementations of these algorithms can be applied to the recognition of time-series and image data.
The Importance of Being Fuzzy
Abstract
Mathematical logic and Boolean logic deal with crisp certainty: a value is either true or false, with nothing in between. This makes logic easy, or at least relatively easy. The real world is imprecise. Experimental errors, measurement uncertainties, and less-than-rigorous categories lead to uncertainty in the data. Fitting these uncertain data as if they were crisp leads to overtraining, overly complex models, and fragile results.
Fuzzy methods address this problem directly by assigning a likelihood or belief to data and classifications. Beliefs are a somewhat relaxed version of probabilities in which normalization and other proper mathematical formalisms are ignored; they keep just enough rigor to be useful. In essence, fuzzy methods trade nominal (overestimated) precision for the ability to state how strongly the data and model support a conclusion. My group has worked on integrating fuzzy methods with machine learning. We started with fuzzy decision trees, where we were concerned with both computational efficiency (for big data) and the ability to resolve limited sets of meaningful decision rules. Our recent work has focused on developing restricted Boltzmann machines as fuzzy learners. These rather simple algorithms can be implemented as an “adaptor” layer that takes raw data and presents it in a robust manner to other machine learners.
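To make the general idea concrete, the short Python sketch below attaches a graded belief to a simple decision rule instead of a crisp true/false answer. It is a minimal illustration only: the membership functions, variable names, and thresholds are hypothetical and are not taken from the algorithms presented in the talk.

```python
# Hypothetical sketch of a belief-weighted (fuzzy) decision rule.
# A borderline input receives a graded belief in [0, 1] rather than
# being forced into a crisp 0/1 classification.

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 outside [a, d], 1 on [b, c], linear ramps between."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def rule_belief(temperature, pressure):
    """Belief that a sample is 'anomalous': fuzzy AND (min) of two graded conditions."""
    high_temp = trapezoid(temperature, 60, 80, 120, 140)   # belief that 'temperature is high'
    low_pressure = trapezoid(pressure, 5, 10, 20, 40)      # belief that 'pressure is low'
    return min(high_temp, low_pressure)                    # fuzzy AND

if __name__ == "__main__":
    # A borderline reading yields a belief of 0.75, not a hard yes/no label.
    print(rule_belief(temperature=75, pressure=25))
```

The point of the sketch is the output: downstream models or human readers see how strongly the data support the conclusion, which is exactly the information a crisp threshold test throws away.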
Boris Koldehofe is a Full Professor at the Technical University of Ilmenau, leading the Distributed and Operating Systems Group at the Department of Computer Science and Automation. He received a Ph.D. degree from Chalmers University of Technology, Gothenburg, Sweden, in 2005. Since then, he has worked in the field of distributed and network-centric computing systems at EPFL (postdoc), the University of Stuttgart, the Technical University of Darmstadt (senior researcher and lecturer), and the University of Groningen (full professor). He has a long-standing interest in event-based and stream processing systems, covering issues related to scalability, performance, mobility, reliability, and security. His current research focuses on complementary aspects of software-defined networks, adaptive communication middleware, distributed in-network computing, and energy-efficient computing. He has contributed to more than 150 scientific publications in major journals, e.g., the IEEE/ACM Transactions on Networking (ToN) and the IEEE Transactions on Parallel and Distributed Systems, and conferences, e.g., ACM/USENIX Middleware and ACM DEBS. He has also served as a tutorial speaker at the ACM/USENIX Middleware, ACM DEBS, GI, and NetSys conferences.
Adaptive and Net-centric Data Stream Processing
ABSTRACT
The performance of data stream processing systems relies heavily on the ability to move data efficiently between stream processing operators. The softwarization of computer networks offers huge potential for distributed systems to accelerate distributed stream processing by minimizing data movement and speeding up the execution of operators. Yet, using in-network computing to accelerate middleware services such as stream processing systems often conflicts with the well-known end-to-end principle. Therefore, in this talk, we will focus on abstractions that allow computations to be adapted to and executed on the heterogeneous resources of network elements, and discuss how these abstractions can support stream processing systems. In particular, we highlight and introduce recent findings in distributed data stream processing, network function virtualization, and real-time packet streaming. We show how different paradigms and programming models accelerate performance by better utilizing the capabilities of in-network computing elements. Moreover, we give an outlook on how future developments can change how distributed computing is adaptively performed over networked infrastructures.
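For readers unfamiliar with the operator abstraction the abstract refers to, the following Python sketch implements a toy tumbling-window count operator. The event format, function name, and window size are illustrative assumptions and do not correspond to any specific system discussed in the talk; in a net-centric deployment, operators like this would be partitioned and placed on network elements rather than run on a single host.

```python
# Hypothetical sketch of a single stream processing operator:
# a tumbling-window count over a key-partitioned event stream.

from collections import defaultdict
from typing import Dict, Iterable, Iterator, Tuple

Event = Tuple[float, str]  # (timestamp in seconds, key)

def tumbling_window_count(events: Iterable[Event],
                          window: float = 10.0) -> Iterator[Tuple[int, str, int]]:
    """Emit (window_index, key, count) for each key once a window has closed.
    Events are assumed to arrive in timestamp order."""
    counts: Dict[str, int] = defaultdict(int)
    current = None
    for ts, key in events:
        idx = int(ts // window)
        if current is not None and idx != current:
            # The previous window is complete: emit its aggregates.
            for k, c in counts.items():
                yield (current, k, c)
            counts.clear()
        current = idx
        counts[key] += 1
    if current is not None:
        # Flush the final (possibly partial) window.
        for k, c in counts.items():
            yield (current, k, c)

if __name__ == "__main__":
    stream = [(1.0, "a"), (2.5, "b"), (3.0, "a"), (12.0, "a"), (15.0, "b")]
    for record in tumbling_window_count(stream):
        print(record)
```

The cost of such an operator in a distributed setting is dominated by moving events to wherever it runs, which is why placing or accelerating operators inside the network, as the talk discusses, can pay off.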
Chalermpol Tapsai holds a Dr.-Ing. from the University of Hagen, Germany, and a Ph.D. in Information Technology from King Mongkut's University of Technology North Bangkok. He also holds an M.Sc. in Applied Statistics from the National Institute of Development Administration and a B.Sc. in Agriculture from Kasetsart University. Tapsai has served as Vice Director for Research and Innovative e-Learning and Vice Dean for Student Affairs at Suan Sunandha Rajabhat University. His research spans natural language processing, data security, and e-learning. He has presented at international conferences and authored over 50 books on computer and information technology.
Artificial Emotion: Research on Making Machines More Human-Like
ABSTRACT
Emotion is a psychological mechanism that occurs inside humans. Emotions originate through a complex process involving sensory systems, interpretation, and summarization linked to knowledge and experiences that differ from person to person. The emergence of emotions results in different responses that are sometimes unexplainable. Emotion is an important marker of the difference between humans and machines, and an important factor that motivates humans to act, not to act, or to cancel an action.
Currently, most research related to artificial intelligence focuses on techniques such as machine learning, neural networks, and deep learning to enable machines to learn from data in various forms. In particular, research on Large Language Models, which are currently very popular, focuses on enabling machines to produce results, find answers, assist with work, or respond to human needs in a variety of ways, including searching for information, negotiating, editing images, drawing, editing videos, composing music, and creating content. Most existing studies have focused on applying machines to assist humans in a machine-like manner rather than on making machines human-like. Very few studies are likely to lead to machines becoming more human-like, such as work on analyzing human emotions from facial expressions and sentiment analysis of texts or documents. Although many such studies have been conducted over a long period, the results are still far from making machines human-like. One reason may be that this type of research involves analyzing human emotions and feelings, which are complex, depend on many variables and factors, and are addressed by many related theories, both psychological and social. Therefore, understanding these theories, the related technology, and the relevant data processing techniques will help researchers plan and design research on Artificial Emotion so as to achieve the goal of developing machines that are more human-like.