Top AI Research articles of 2019

CHINESE & IRANIAN ARTIFICIAL INTELLIGENCE IN LOW EARTH ORBIT TO THAAD SPACE WARS

    Rory Lewis, Department of Computer Science, University of Colorado Colorado Springs, USA

    ABSTRACT

    This paper addresses the role artificial intelligence will play in low earth orbit (LEO) and space warfare, and specifically what China's and Iran's latest research in artificial intelligence for unmanned aerial vehicles (UAVs) in LEO space war domains has been, is, and strives to become. The author first presents testimony from scholars and space research scientists from many countries who categorically state that all future space warfare will be conducted by satellites and UAVs, which in turn will rely heavily on artificial intelligence. The paper includes an analysis of China's strengths in space artificial intelligence and of Iran's systematic approach to arming its UAVs. The second portion of this research drills down into the specific mathematical and theoretical areas of artificial intelligence research for LEO space wars in various countries, including China. The author concludes with research strategies to counter China's dominance of space wars.

    KEYWORDS

    Twitter; sentiment analysis; word clouds; text mining; information retrieval


    For More Details :
    https://aircconline.com/csit/papers/vol9/csit91406.pdf


    5th International Conference on Computer Science and Information Technology (CSTY 2019)


AN ARTIFICIAL NEURAL NETWORK APPROACH FOR THE CLASSIFICATION OF HUMAN LOWER BACK PAIN

    Shubham Sharma and Rene V. Mayorga, University of Regina, Canada

    ABSTRACT

    In today's world, lower back pain is one of the fastest growing and most crucial ailments to deal with. More than half of the world's population suffers from it at least once in a lifetime. Human Lower Back Pain symptoms are commonly categorized as Normal or Abnormal. In order to remedy Human Lower Back Pain, and with the growth of technology over time, many medical methods have been developed to diagnose and cure this pain at the earliest possible stage. This study aims to develop two Machine Learning (ML) models which can classify Human Lower Back Pain symptoms in a human body using non-conventional techniques such as feed-forward/back-propagation Artificial Neural Networks and Fully Connected Deep Networks. An Automatic Feature Engineering technique is implemented to extract the feature data used for the classification. The proposed models are compared with respect to a Support Vector Machine model, considering different performance parameters.
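
    The comparison the abstract describes can be sketched as follows. This is a minimal illustration (not the authors' code), assuming scikit-learn; the data here are synthetic stand-ins for the spinal measurements and the Normal/Abnormal labels:

    # Back-propagation ANN vs. SVM baseline on a two-class back pain dataset.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.neural_network import MLPClassifier
    from sklearn.svm import SVC
    from sklearn.metrics import accuracy_score

    # X: hypothetical spinal/pelvic measurements; y: 0 = Normal, 1 = Abnormal
    rng = np.random.default_rng(0)
    X = rng.normal(size=(310, 12))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    scaler = StandardScaler().fit(X_tr)
    X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)

    ann = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000,
                        random_state=0).fit(X_tr, y_tr)   # feed-forward/back-prop ANN
    svm = SVC(kernel="rbf").fit(X_tr, y_tr)               # SVM comparison model

    print("ANN accuracy:", accuracy_score(y_te, ann.predict(X_te)))
    print("SVM accuracy:", accuracy_score(y_te, svm.predict(X_te)))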

    KEYWORDS

    Machine Learning, Artificial Neural Networks, Fully Connected Deep Networks, Support Vector Machine, Lower Back Pain, Automatic Feature Engineering technique.


    For More Details :
    https://aircconline.com/csit/papers/vol9/csit91313.pdf


    6th International Conference on Computer Science, Engineering and Information Technology (CSEIT-2019)


CONSTRUCTION OF AN ORAL CANCER AUTO-CLASSIFY SYSTEM BASED ON MACHINE LEARNING FOR ARTIFICIAL INTELLIGENCE

    Meng-Jia Lian1, Chih-Ling Huang2 and Tzer-Min Lee1,3, 1School of Dentistry, Kaohsiung Medical University, Taiwan, 2Kaohsiung Medical University, Kaohsiung, Taiwan, 3National Cheng Kung University Medical College, Taiwan

    ABSTRACT

    Oral cancer is one of the most widespread tumors of the head and neck region. An earlier diagnosis can help the dentist devise a better therapy plan and give patients better treatment, so reliable techniques for detecting oral cancer cells are urgently required. This study proposes an optical and automated method using reflection images obtained with a scanned laser pico-projection system and the Gray-Level Co-occurrence Matrix for sampling. Moreover, an artificial intelligence technique, the Support Vector Machine, was used to classify samples. Normal Oral Keratinocyte and Dysplastic Oral Keratinocyte cells, which simulate the evolution of cancer, were classified. The accuracy in distinguishing the two cell types reached 85.22%. Compared to existing diagnosis methods, the proposed method possesses many advantages, including lower cost, larger sample size, and instant, non-invasive, more reliable diagnostic performance. As a result, it provides a highly promising solution for the early diagnosis of oral squamous carcinoma.
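
    The GLCM-plus-SVM pipeline the abstract describes can be sketched roughly as below, assuming scikit-image (>= 0.19) and scikit-learn; the random images stand in for the reflection images of NOK and DOK cells:

    # Texture features from a gray-level co-occurrence matrix, classified by an SVM.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops
    from sklearn.svm import SVC

    def glcm_features(img):
        # Quantize to 8 gray levels, build a co-occurrence matrix, and compute
        # standard Haralick-style statistics from it.
        q = (img // 32).astype(np.uint8)
        glcm = graycomatrix(q, distances=[1], angles=[0], levels=8,
                            symmetric=True, normed=True)
        return [graycoprops(glcm, p)[0, 0]
                for p in ("contrast", "homogeneity", "energy", "correlation")]

    rng = np.random.default_rng(1)
    imgs = rng.integers(0, 256, size=(40, 64, 64))    # stand-in reflection images
    labels = np.array([0] * 20 + [1] * 20)            # 0 = NOK, 1 = DOK

    X = np.array([glcm_features(im) for im in imgs])
    clf = SVC(kernel="rbf").fit(X[::2], labels[::2])  # train on every other sample
    print("held-out accuracy:", clf.score(X[1::2], labels[1::2]))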

    KEYWORDS

    Oral Cancer Cell, Normal Oral Keratinocyte (NOK), Dysplastic Oral Keratinocyte (DOK), Gray-Level Co-occurrence Matrix (GLCM), Scanned Laser Pico-Projection (SLPP), Support Vector Machine (SVM), Machine Learning.


    For More Details :
    https://aircconline.com/csit/papers/vol9/csit90903.pdf


    9th International Conference on Computer Science, Engineering and Applications (CCSEA 2019)


EVOLUTIONARY ALGORITHMS TO SIMULATE REAL CONDITIONS IN ARTIFICIAL INTELLIGENCE AS BASIS FOR MATHEMATICAL FUZZY CLUSTERING

    Ness, S. C. C., Evocell Institute, Austria

    ABSTRACT

    In present-day physics we may assume space is a perfect continuum describable by discrete mathematics, or a set of discrete elements described by a programmed probabilistic process, or find alternative models that grasp real conditions better because they more closely simulate real behaviour. Clustering logic based on evolutionary algorithms is able to give meaning to the unlimited amounts of data that enterprises generate and that contain valuable hidden knowledge. Evolutionary algorithms are useful for making sense of this hidden knowledge, as they are very close to nature and the mind. However, most known applications of evolutionary algorithms cluster each data point into a single group, thereby leaving out key aspects needed to understand the data and thus hampering simulations of biological processes. Fuzzy clustering methods divide data points into groups based on item similarity and detect patterns between items in a set, whereby data points can belong to more than one group. An evolutionary-algorithm-inspired fuzzy clustering multivariate mechanism allows for changes at each iteration of the algorithm and improves performance from one feature to another and from one cluster to another. It is applicable to real-life objects that are neither circular nor elliptical and thereby allows for clusters of any predefined shape. In this paper we explain the philosophical concept of using evolutionary algorithms to produce fuzzy clustering methods that yield good clustering quality in the fields of virtual reality, augmented reality and gaming applications, as well as in industrial manufacturing, robotic assistants, product development, law and forensics, and parameterless body model extraction from CCTV camera images.
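
    The fuzzy clustering building block the abstract combines with evolutionary search can be sketched in a few lines of numpy. A minimal sketch follows, using the standard fuzzy c-means alternating update rather than an evolved population of candidate solutions:

    # Fuzzy c-means: every point holds a membership in every cluster.
    import numpy as np

    def fuzzy_c_means(X, c=3, m=2.0, iters=100, seed=0):
        rng = np.random.default_rng(seed)
        U = rng.random((len(X), c))
        U /= U.sum(axis=1, keepdims=True)       # memberships sum to 1 per point
        for _ in range(iters):
            W = U ** m
            centers = (W.T @ X) / W.sum(axis=0)[:, None]
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
            U = 1.0 / (d ** (2 / (m - 1)))      # closer centers get higher membership
            U /= U.sum(axis=1, keepdims=True)
        return centers, U

    X = np.vstack([np.random.default_rng(i).normal(i * 4, 1, size=(50, 2))
                   for i in range(3)])
    centers, U = fuzzy_c_means(X)
    print(centers)                               # roughly (0,0), (4,4), (8,8)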

    KEYWORDS

    Artificial Evolution, Artificial Intelligence, Biology, Big Data, Cellular Automata, Data Interpretation and Analytics, Deep Learning, Feature Selection, Genetic Algorithms, Generative Models, Machine Learning, Pattern Recognition, Robotic Process Automation, Simulation, Smart Systems, Virtual Machines, Visualization.


    For More Details :
    https://aircconline.com/csit/papers/vol9/csit90501.pdf


    7th International Conference on Computational Science and Engineering (CSE)


SMARTGRAPH: AN ARTIFICIALLY INTELLIGENT GRAPH DATABASE

    Hal Cooper1, Garud Iyengar1 and Ching-Yung Lin2, 1Department of Industrial Engineering and Operations Research, 2Department of Electrical Engineering, Columbia University, USA

    ABSTRACT

    Graph databases and distributed graph computing systems have traditionally abstracted the design and execution of algorithms by encouraging users to take the perspective of lone graph objects, like vertices and edges. In this paper, we introduce the SmartGraph, a graph database that instead relies upon thinking like a smarter device often found in real-life computer networks, the router. Unlike existing methodologies that work at the subgraph level, the SmartGraph is implemented as a network of artificially intelligent Communicating Sequential Processes. The primary goal of this design is to give each ``router” a large degree of autonomy. We demonstrate how this design facilitates the formulation and solution of an optimization problem which we refer to as the “router representation problem”, wherein each router selects a beneficial graph data structure according to its individual requirements (including its local data structure, and the operations requested of it). We demonstrate a solution to the router representation problem wherein the combinatorial global optimization problem with exponential complexity is reduced to a series of linear problems locally solvable by each AI router.
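
    The flavour of the router representation problem can be conveyed with a toy local optimization. This is an illustrative sketch only; the cost model below is invented, not the paper's, but it shows each router independently picking the cheapest representation for its own subgraph and workload:

    # Each router minimizes a local cost instead of one global combinatorial choice.
    def choose_representation(n_vertices, n_edges, edge_queries, neighbor_scans):
        costs = {
            # adjacency matrix: O(V^2) memory, O(1) edge lookup, O(V) neighbor scan
            "matrix": n_vertices * n_vertices + edge_queries
                      + neighbor_scans * n_vertices,
            # adjacency list: O(E) memory, edge lookup ~ average degree, cheap scans
            "list": n_edges
                    + edge_queries * (n_edges / max(n_vertices, 1))
                    + neighbor_scans,
        }
        return min(costs, key=costs.get)

    # Two routers with different local subgraphs reach different decisions:
    print(choose_representation(100, 4000, edge_queries=10_000, neighbor_scans=10))
    print(choose_representation(100, 200, edge_queries=10, neighbor_scans=10_000))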

    KEYWORDS

    Intelligent Information, Database Systems, Graph Computing.


    For More Details :
    https://airccj.org/CSCP/vol9/csit90307.pdf


    7th International Conference on Signal Image Processing and Multimedia (SIPM 2019)


MAGNETIC ANOMALIES DUE TO 2-D CYLINDRICAL STRUCTURES - AN ARTIFICIAL NEURAL NETWORK BASED INVERSION

    Bhagwan Das Mamidala1 and Sundararajan Narasimman2, 1Osmania University, India, 2Sultan Qaboos University, Oman

    ABSTRACT

    An application of an Artificial Neural Network Committee Machine (ANNCM) to the inversion of magnetic anomalies caused by a long 2-D horizontal circular cylinder is presented. Although the subsurface targets are of arbitrary shape, they are assumed to have regular geometrical shapes for convenience of mathematical analysis. The ANNCM inversion extracts the parameters of the causative subsurface targets, including the depth to the centre of the cylinder (Z), the inclination of the magnetic vector (θ), and the constant term (A) comprising the radius (R) and the intensity of the magnetic field (I). The method of inversion is demonstrated on a theoretical model, with and without random noise in order to study the effect of noise on the technique, and is then extended to real field data. It is noted that the method under discussion ensures fairly accurate results even in the presence of noise. ANNCM analysis of a vertical magnetic anomaly near Karimnagar, Telangana, India, has shown satisfactory results in comparison with other inversion techniques that are in vogue.
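
    A committee machine of the kind described can be sketched as several independently initialized networks whose predictions are averaged. This is a minimal illustration assuming scikit-learn; the forward model below is a simplified stand-in for the 2-D cylinder expression, and only (Z, A) are inverted:

    # Committee of neural networks for inverting synthetic anomaly profiles.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    x = np.linspace(-50, 50, 101)               # measurement stations

    def anomaly(z, a):                          # stand-in forward model
        return a * z / (x**2 + z**2)

    rng = np.random.default_rng(0)
    Z = rng.uniform(5, 20, 500)
    A = rng.uniform(50, 200, 500)
    X = np.array([anomaly(z, a) for z, a in zip(Z, A)])
    Y = np.column_stack([Z, A])

    committee = [MLPRegressor(hidden_layer_sizes=(30,), max_iter=2000,
                              random_state=s).fit(X, Y) for s in range(5)]

    # Noisy test profile: the committee's averaged estimate damps individual errors.
    test = anomaly(12.0, 120.0) + rng.normal(0, 0.05, x.size)
    pred = np.mean([m.predict(test[None, :]) for m in committee], axis=0)
    print("committee estimate of (Z, A):", pred.round(2))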

    KEYWORDS

    Magnetic anomaly, Artificial Neural Network, Committee machine, Levenberg-Marquardt algorithm, Hilbert transform, modified Hilbert transform.


    For More Details :
    https://airccj.org/CSCP/vol9/csit90105.pdf


    3rd International Conference on Computer Science and Information Technology (COMIT 2019)


MACHINE LEARNING MODEL TO PREDICT BIRTH WEIGHT OF NEW BORN USING TENSORFLOW

    S. Karthiga, K. Indira and C. V. Nisha Angeline, Thiagarajar College of Engineering, Madurai, India

    ABSTRACT

    Low birth weight is a major problem for newborns. Low birth weight is a term used to describe babies who are born weighing less than 5 pounds, 8 ounces (2,500 grams). Low-birth-weight babies are more likely than babies of normal weight to have health problems as newborns, and almost 40 percent of newborns suffer from being underweight. Predicting birth weight before the baby is born is the best way to help the baby get special care as early as possible; it allows doctors and special facilities to be arranged in advance. There are several factors that affect birth weight. Past studies have observed that these factors range from biological characteristics, such as the baby's sex, race, age of mother and father, and weight gained by the mother during pregnancy, to behavioural characteristics such as the smoking and drinking habits of the mother and the education and living conditions of the parents. This project focuses on developing a web application that predicts baby weight taking the baby's gender, plurality, gestation weeks and mother's age as inputs. Machine learning is one of the domains that plays an important role in the medical industry, and many machine learning models have been developed to predict diseases at an early stage. In this project a wide and deep neural network model is developed using the TensorFlow library in the Google Cloud environment. A Wide and Deep Neural Network combines a wide linear model and a deep neural network, providing both memorization and generalization. Pre-processing and training are done in a distributed environment using Cloud Dataflow and Cloud ML Engine. The model is then deployed as a REST API, and a web application is developed to invoke the API with the user inputs and show the predicted baby weight to the users. It is scalable and provides high performance.
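
    The wide and deep architecture with the four inputs listed above can be sketched in tf.keras as follows. This is a minimal sketch assuming TensorFlow 2.x; the preprocessing, Cloud Dataflow/ML Engine plumbing, and the REST layer are omitted:

    # Wide (linear) path for memorization + deep path for generalization.
    import tensorflow as tf

    is_male = tf.keras.Input(shape=(1,), name="is_male")          # 0/1
    plurality = tf.keras.Input(shape=(1,), name="plurality")      # number of babies
    gestation = tf.keras.Input(shape=(1,), name="gestation_weeks")
    mother_age = tf.keras.Input(shape=(1,), name="mother_age")
    features = tf.keras.layers.Concatenate()([is_male, plurality,
                                              gestation, mother_age])

    wide = tf.keras.layers.Dense(1)(features)            # linear: memorization
    deep = tf.keras.layers.Dense(64, activation="relu")(features)
    deep = tf.keras.layers.Dense(32, activation="relu")(deep)
    deep = tf.keras.layers.Dense(1)(deep)                # deep: generalization

    output = tf.keras.layers.Add()([wide, deep])         # combine both paths
    model = tf.keras.Model([is_male, plurality, gestation, mother_age], output)
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])
    model.summary()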

    KEYWORDS

    scalability, high performance, availability, maintainability, repeatability, abstraction and testability


    For More Details :
    https://aircconline.com/csit/papers/vol9/csit91506.pdf


    First International Conference on Secure Reconfigurable Architectures & Intelligent Computing (SRAIC 2019)


PREDICTION OF WORKPIECE QUALITY: AN APPLICATION OF MACHINE LEARNING IN MANUFACTURING INDUSTRY

    Günther Schuh1, Paul Scholz1, Sebastian Schorr2, Durmus Harman1, Matthias Möller3, Jörg Heib3 and Dirk Bähre2, 1RWTH Aachen University, Aachen, Germany, 2Saarland University, Saarbrücken, Germany, 3Bosch Rexroth AG, Germany

    ABSTRACT

    A significant amount of data is generated and could be utilized in order to improve the quality, time, and cost related performance characteristics of the production process. Machine Learning (ML) is considered a particularly effective method of data processing with the aim of generating usable knowledge from data, and it therefore becomes increasingly relevant in manufacturing. In this research paper, a technology framework is created that supports solution providers in the development and deployment process of ML applications. This framework is subsequently employed successfully in the development of an ML application for quality prediction in a machining process at Bosch Rexroth AG. For this purpose, the 50 most relevant features were extracted from time series data and used to determine the best ML operation. The Extra Trees Regressor (XT) is found to achieve precise predictions, with a coefficient of determination (R²) consistently over 91% for the considered quality characteristics of a bore of hydraulic valves.
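
    The final modeling step described above can be sketched with scikit-learn. A minimal illustration with synthetic stand-in data, where X plays the role of the 50 features extracted from the machining time series:

    # Extra Trees regression scored with the coefficient of determination R^2.
    import numpy as np
    from sklearn.ensemble import ExtraTreesRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 50))                       # 50 features per workpiece
    y = X[:, :5].sum(axis=1) + rng.normal(0, 0.3, 1000)   # quality characteristic

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    xt = ExtraTreesRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("R^2:", round(r2_score(y_te, xt.predict(X_te)), 3))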

    KEYWORDS

    Technology Management Framework, Quality Prediction, Machine Learning, Manufacturing, Workpiece Quality


    For More Details :
    https://aircconline.com/csit/papers/vol9/csit91316.pdf


    6th International Conference on Computer Science, Engineering and Information Technology (CSEIT-2019)


INCLUDING NATURAL LANGUAGE PROCESSING AND MACHINE LEARNING INTO INFORMATION RETRIEVAL

    Piotr Malak and Artur Ogurek, University of Wrocław, Poland

    ABSTRACT

    In the current paper we discuss the results of preliminary, but promising, research on including some Natural Language Processing (NLP) and Machine Learning (ML) approaches in Information Retrieval (IR). Classical IR uses indexing and term weighting in order to increase the pertinence of the answers given to users' queries. Such an approach allows for matching on meaning, i.e. matching all keywords of the same or very similar meaning as expressed in the user query. For most cases this approach is sufficient to fulfil user information needs. However, indexing and retrieving information over professional-language texts brings new challenges as well as new possibilities. One of the challenges is different grammar, which creates the need to adjust NLP tools to a given professiolect. One of the possibilities is detecting the context in which an indexed term occurs in the text. In our research we made an attempt to answer the question of whether a Natural Language Processing approach combined with supervised Machine Learning is capable of detecting contextual features of professional-language texts.
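
    The combination described, classical term weighting feeding a supervised classifier that labels the context of occurrence, can be sketched as below with scikit-learn. The tiny corpus and the context labels are invented for illustration:

    # TF-IDF term weighting + supervised context classification.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    texts = ["the court dismissed the appeal",
             "the patient was discharged after surgery",
             "the judge ruled on the motion",
             "the surgeon closed the incision"]
    contexts = ["legal", "medical", "legal", "medical"]

    clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
    clf.fit(texts, contexts)
    print(clf.predict(["counsel filed a motion to dismiss"]))  # expected: ['legal']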

    KEYWORDS

    Enhanced Information Retrieval, Contextual IR, NLP, Machine Learning.


    For More Details :
    https://aircconline.com/csit/papers/vol9/csit91202.pdf


    8th International Conference on Natural Language Processing (NLP 2019)


A MACHINE LEARNING ALGORITHM IN AUTOMATED TEXT CATEGORIZATION OF LEGACY ARCHIVES

    Dali Wang1, Ying Bai2 and David Hamblin1, 1Christopher Newport University, USA, 2Johnson C. Smith University, USA

    ABSTRACT

    The goal of this research is to develop an algorithm to automatically retrieve critical information from raw data files in NASA's airborne measurement data archive. The product has to meet specific metrics in terms of accuracy, robustness and usability, as the initial decision-tree-based development showed limited applicability due to its resource-intensive characteristics. We have developed an innovative solution that is much less resource intensive while offering comparable performance. As with many practical applications, the data available are noisy and correlated, and there is a wide range of features associated with the information to be retrieved. The proposed algorithm uses a decision tree to select features and determine their weights. A weighted Naive Bayes is used due to the presence of highly correlated inputs. The development has been successfully deployed at an industrial scale, and the results show that it is well balanced in terms of performance and resource requirements.
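
    The core idea, a decision tree supplying feature weights to a Naive Bayes scorer, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the tree's feature importances scale each feature's class-conditional log-likelihood, so weakly weighted features contribute little and zero-weight features drop out:

    # Decision-tree feature weighting + weighted Gaussian Naive Bayes.
    import numpy as np
    from scipy.stats import norm
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(600, 20))
    y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)  # only a few features matter

    Xtr, ytr, Xte, yte = X[:500], y[:500], X[500:], y[500:]

    tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(Xtr, ytr)
    w = tree.feature_importances_                  # a zero weight drops a feature

    def weighted_nb_predict(Xq):
        scores = []
        for c in (0, 1):
            Xc = Xtr[ytr == c]
            # Per-feature Gaussian log-likelihoods, scaled by the tree weights.
            ll = norm.logpdf(Xq, Xc.mean(axis=0), Xc.std(axis=0) + 1e-9)
            scores.append(np.log((ytr == c).mean()) + (ll * w).sum(axis=1))
        return np.argmax(scores, axis=0)

    print("weighted NB accuracy:", (weighted_nb_predict(Xte) == yte).mean())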

    KEYWORDS

    Machine Learning, Classification, Naïve Bayes, Decision Tree.


    For More Details :
    https://aircconline.com/csit/papers/vol9/csit90701.pdf


    8th International Conference on Soft Computing, Artificial Intelligence and Applications (SAI 2019)





