International Journal of Algorithms Design and Analysis
https://computers.journalspub.info/index.php?journal=JADA
<p align="center"><strong>International Journal of Algorithm Design and Analysis (IJADA)</strong></p><p align="center"><strong> </strong></p><p align="center"><strong>Click <a href="/index.php?journal=JADA&page=about&op=editorialTeam">here</a> for complete Editorial Board</strong></p><p align="center"> </p><p align="center"> </p><p><strong>International Journal of Algorithm Design and Analysis (IJADA)</strong> is a journal focused towards the rapid publication of fundamental research papers on all areas of algorithm design and analysis. It's a biannual journal, started in 2015.</p><p><strong>Journal DOI No: 10.37628/</strong><strong>IJADA</strong></p><p><strong>Readership:</strong> Graduates, Postgraduates, Research Scholars, in Institutions, and IT Companies</p><p><strong>Indexing: </strong>The Journal is indexed in Google Scholar</p><p align="left"><strong><span style="text-decoration: underline;">Focus and Scope</span></strong></p><ul><li>Divide and conquer</li><li>Dynamic programming</li><li>Greedy algorithms</li><li>Back tracking</li><li>Algorithmic languages</li><li>Divide-and-conquer algorithm</li><li>Dynamic programming</li><li>Amortized analysis</li><li>Linear programming</li><li>Linear-time sorting</li><li>Advanced algorithm design</li><li>Randomized algorithms</li></ul><p align="left"> </p><p><strong>Submission of Paper: </strong></p><p>All contributions to the journal are rigorously refereed and are selected on the basis of quality and originality of the work. The journal publishes the most significant new research papers or any other original contribution in the form of reviews and reports on new concepts in all areas pertaining to its scope and research being done in the world, thus ensuring its scientific priority and significance.</p><p>Manuscripts are invited from academicians, students, research scholars and faculties for publication consideration. 
</p><p>Papers are accepted for editorial consideration through email <strong>info@journalspub.com or nikita@stmjournals.com</strong></p><p><strong>Abbreviation:</strong> IJADA</p><p><strong> Frequency</strong>: Two issues per year</p><p><a href="https://journalspub.com/editorial-board/IJADA/"><strong>Editorial Board</strong></a></p><p><a href="https://journalspub.com/for-author/"><strong>Instructions to Authors</strong></a></p><p> </p>en-USInternational Journal of Algorithms Design and AnalysisDiabetes Prediction Using the Random Forest Algorithm and Machine Learning
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=893
<p>Early disease detection is crucial in the medical industry in order to prevent disease. Diabetes is a perilous illness characterized by the body's inability to produce or properly respond to insulin, leading to abnormally high levels of sugar in the bloodstream. One of the deadliest diseases, it affects a large number of people. In order to manage the illness, it is essential to understand its symptoms. This condition can be brought on by ageing, inactivity, inherited diabetes, a poor diet, high blood pressure, etc. The healthcare sector holds many large databases, so using big data analytics we can discover knowledge in the data by looking for hidden patterns and unknown correlations, and then forecast the outcome appropriately. In order to improve classification prediction, we developed a diabetes prediction model in this study using a machine learning algorithm. Using the Pima Indian Dataset, this research work forecasts the presence of diabetes. To establish whether a diabetes diagnosis is correct or incorrect, machine learning algorithms analyse the dataset. The training and testing portions of the dataset employed in this study are split 70:30, respectively. Based on a patient's current medical record, the model determines whether they have diabetes. With a prediction accuracy of 95.89%, the recommended ML model exceeds the previously reported methods.</p>P. Hari KrishnaD. SnehardhiY. NaveenV. Siddhartha
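A minimal sketch of the 70:30 train/test split described in the abstract above, using only the Python standard library; the records below are placeholders, not the Pima Indian data, and this is not the authors' pipeline:

```python
import random

def split_70_30(rows, seed=0):
    """Shuffle the rows reproducibly, then split them 70:30 into train/test."""
    rng = random.Random(seed)
    shuffled = rows[:]            # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * 0.7)
    return shuffled[:cut], shuffled[cut:]

# Toy stand-in for patient records: (feature, label) pairs.
records = [(i, i % 2) for i in range(100)]
train, test = split_70_30(records)
print(len(train), len(test))  # 70 30
```

A real model (e.g. a random forest) would then be fitted on `train` and evaluated on `test`.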
Copyright (c) 2023 International Journal of Algorithms Design and Analysis
2023-04-152023-04-15823239Bridging the Gap of Learning Between Tutor and Student
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=894
<p>Students are taught by qualified, fully verified instructors, and any teacher is available upon request so that students can learn on any device at their own pace. The platform empowers teachers, students, and organizations with the tools they need to make a change. It converts professionals into coaches and helps them gain recognition for their knowledge. Instructors keep 85% of their revenue while the platform promotes their courses and handles everything else for them. They can structure and innovate their subjects in their own style with constant feedback from the students, build a personal brand, teach anything they want, and make a significant financial contribution to students' lives.</p>Om BorkarDeep LahaneParshuram NikamRahul ShindeSangeeta Menon
Copyright (c) 2023 International Journal of Algorithms Design and Analysis
2023-04-142023-04-14822631Comparison of Machine Learning Classification Algorithms for Face Recognition Detection and Recognition: A Review
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=892
<p>Face detection is a biometric technique that refers to the automatic detection of a face by computerized systems from an image of the face. It is a common function seen in digital cameras, biometrics, and social tagging. Over the past few years, face recognition and detection have drawn increased study attention. In this study, we investigated various methods for face detection and put them into practice using MATLAB software.</p>Gajanand GuptaVirendra JangidVaibhav GuptaRanjeet Singh SisodiaRavindra Arya
Copyright (c) 2023 International Journal of Algorithms Design and Analysis
2023-04-142023-04-14822025Prediction of Accident Severity Using Machine Learning Algorithms
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=891
<p>Even though they happen so regularly, car accidents usually rank among the most terrifying experiences a person can have. Using crash severity prediction models, various government agencies may learn more about the factors that influence or are related to collision severity, allowing them to foresee the seriousness of an accident. Machine learning algorithms may assist in finding trends to forecast the severity of an accident using accident data. To accurately forecast the accident severity for this study, we created a prediction framework and applied three different machine learning algorithms: random forest, logistic regression, and decision tree.</p>Meet Gudhka
Copyright (c) 2023 International Journal of Algorithms Design and Analysis
2023-03-312023-03-31821519Another Solution to Knuth’s Tree Traversal Problem
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=853
<p>In 1968, Donald Knuth posed the following problem on binary tree traversals, which has engaged computer scientists ever since: Design a non-recursive algorithm that, without using an auxiliary stack, traverses an unthreaded binary tree in which one bit of temporary storage is allowed in each node. The tree can be modified in any way but must be restored to its original state before the algorithm ends. To date, several solutions to Knuth’s problem have been published. The best-known and quite elegant solution to this problem is the Morris algorithm, which also solves Knuth’s 1973 reformulation of the problem, which disallows the use of any temporary storage bits (e.g., tag bits). But because the Morris algorithm dynamically threads the tree and then uses these threads during the traversal, the Morris algorithm arguably violates the requirement that the tree be unthreaded. Moreover, because its use of some of the right links in the tree essentially constitutes a stack, it also arguably violates the requirement that the algorithm be stack free. Of course, it can equally well be argued that the Morris algorithm is indeed a solution, given that Knuth’s problem states that the tree can be modified in any way as long as it is restored to its original state. The Schorr-Waite algorithm avoids the need for a stack by inverting the parent-to-child links to the child-to-parent orientation as it advances down the tree. The inverted links are then used to backtrack whenever a branch in the tree has been fully traversed. However, this algorithm requires a tag field in each node. Thus, it violates the requirement of the 1973 version of Knuth’s problem that temporary storage cannot be used. The Robson algorithm uses the same link-inversion technique as the Schorr-Waite algorithm. But it does not use tag fields. In place of the tag bits, it uses an internal stack (a stack dynamically created within the tree using the link fields that contain null pointers). 
Thus, it violates Knuth’s requirement that an auxiliary stack cannot be used. The algorithm we present uses neither an internal nor an external stack. Thus, it truly is a stack-free algorithm. Moreover, it does not modify the tree in any way during execution. Thus, multiple processes can traverse the same tree concurrently. The algorithm is applicable to any binary tree with the standard structure except that the link fields must contain address differences rather than absolute addresses. We call such a tree a D-tree (“D” for difference). The use of address differences in the link fields of the tree is what makes backtracking possible without the use of a stack. In a D-tree, backtracking and advancing each require only one computation (a subtraction to backtrack and an addition to advance). Highlights: • Another solution to Knuth’s tree traversal problem • A stack-free unthreaded tree traversal algorithm that does not modify the tree • A new representation of a binary tree.</p>Kristin MayoCameron PardoAnthony J. Dos Reis
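For reference, the Morris algorithm discussed above can be sketched as follows; this is the standard textbook formulation, not the authors' D-tree algorithm. It temporarily threads each left subtree's rightmost node back to the current node, then removes the thread on the second visit, so the tree ends up restored:

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def morris_inorder(root):
    """In-order traversal without a stack or recursion (Morris traversal)."""
    out, cur = [], root
    while cur:
        if cur.left is None:
            out.append(cur.val)
            cur = cur.right
        else:
            # Find the in-order predecessor of cur in its left subtree.
            pred = cur.left
            while pred.right and pred.right is not cur:
                pred = pred.right
            if pred.right is None:
                pred.right = cur      # first visit: create a temporary thread
                cur = cur.left
            else:
                pred.right = None     # second visit: remove the thread
                out.append(cur.val)
                cur = cur.right
    return out

tree = Node(4, Node(2, Node(1), Node(3)), Node(6, Node(5), Node(7)))
print(morris_inorder(tree))  # [1, 2, 3, 4, 5, 6, 7]
```

Because the threads are removed on the second visit, running the traversal twice on the same tree yields the same result, confirming the tree is restored.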
Copyright (c) 2022 International Journal of Algorithms Design and Analysis
2022-11-142022-11-1482114WNNM with WMF for Improvement in Image Denoising and Image Segmentation Accuracy
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=798
<p>Denoising is an important task in image processing, and a diversity of algorithms has been proposed for image denoising. Weighted nuclear norm minimization (WNNM) is one of them. Despite the good performance WNNM achieves in removing non-sparse noise, it is less powerful at removing sparse noise such as salt-and-pepper noise. This study proposes a novel method combining a weighted median filter (WMF) with WNNM to improve the denoising effect. Furthermore, this research also implements a Markov random field (MRF) to verify that this improvement in denoising can effectively improve accuracy in image segmentation. In our experiments, the PSNR increases by up to 10.04 dB when WMF is added. Moreover, the results show that our proposed method performs better than combining a traditional median filter with WNNM in most situations, and better than the combination of an adaptive median filter and WNNM when the salt-and-pepper noise level is low. Finally, the experimental results show that our method achieves a higher PSNR more effectively than running WNNM for more iterations, because the time consumed by our method is remarkably less.</p>Sicong Li
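As a rough illustration of the median-filtering step that removes salt-and-pepper noise, here is a plain, unweighted 3×3 median filter in pure Python; the paper's weighted median filter additionally weights the window samples, which is not shown:

```python
import statistics

def median_filter_3x3(img):
    """Plain (unweighted) 3x3 median filter on a 2D grayscale image.

    Border pixels use only the neighbours that exist (a shrunken window).
    """
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            window = [img[yy][xx]
                      for yy in range(max(0, y - 1), min(h, y + 2))
                      for xx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = statistics.median(window)
    return out

# A flat grey image corrupted with one salt (255) and one pepper (0) pixel.
noisy = [[100] * 5 for _ in range(5)]
noisy[1][1], noisy[3][3] = 255, 0
clean = median_filter_3x3(noisy)
print(clean[1][1], clean[3][3])  # 100 100
```

The isolated outliers never reach the median of any window, so they are removed while the flat background is preserved.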
Copyright (c) 2022 International Journal of Algorithms Design and Analysis
2022-07-222022-07-2282615Smart E-voting System Using Blockchain Technology
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=797
<p>Electronic voting (sometimes known as e-voting) is a voting system that involves keeping track of and precisely tallying the votes cast by users. A secure electronic voting system must prevent duplicate votes and be fully transparent while safeguarding voters' privacy. The shortcomings of the traditional voting system include the lack of vote reliability: there is no guarantee that the votes individuals cast are not tampered with before they are entered into the system, and the voter has no communication with the system. We propose that blockchain technology be employed as a voting medium to alleviate all these difficulties. The benefits of employing e-voting technology include reduced election expenses such as material, logistical, and salary costs. Politicians and management would have better access to the public's viewpoint. If voters are unable to vote in person, they can vote remotely; as a result, total turnout improves. E-voting can be extremely beneficial because it allows anyone to simply access the election, cast their votes, and announce their preference. People can share private hyperlinks to any produced poll with anybody who knows the link, and anyone who knows the link can vote, but only one browser can vote. In terms of voter verification, duplicate votes, and non-repudiation of votes, the security here is rather weak. E-voting has been explored in depth, and various systems have been tried and even deployed for a period of time. However, only a few implementations are sufficiently trustworthy and continue to be used.</p>Satyam DanawaleSumit BathePranay Jambhavdekar
Copyright (c) 2022 International Journal of Algorithms Design and Analysis
2022-07-222022-07-228215A Study on Object Detection Using Computer Vision Techniques
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=813
<p>Object detection is a method used in computer vision. Detection is the process of finding a specific object in a series of images or videos. Several contributions to foreground detection and tracking have recently been proposed and effectively demonstrated. The primary problem in motion tracking algorithms is estimating object motion as precisely and efficiently as feasible. Moving object detection is critical in all surveillance applications, including video analysis, video communication, traffic management, medical imaging, and military operations. For identifying objects, we employed computer vision methods such as Single Shot Detection and neural network algorithms. Some of the most powerful tools in machine learning and artificial intelligence are object detection algorithms for computer vision problems. These are decision-making algorithms that allow computer systems to draw conclusions about the real world as seen via a camera. Robots that move items, autonomous vehicles, and picture classification software would be very difficult to develop without object recognition.</p>Sapna Verma
Copyright (c) 2022 International Journal of Algorithms Design and Analysis
2022-07-222022-07-22823146Demand Forecasting: Your Best Bet for Business Growth
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=812
<p>Demand forecasting is undeniably the most important component of any supply chain. It establishes the level of readiness needed on the supply side to meet the demand and estimates the demand for the anticipated future requirement. If an organization does not predict demand with reasonable accuracy, the supply chain takes a huge hit. Given the importance of demand forecasting, let us discuss the forecasting techniques used to predict future demand. In order to produce accurate forecasts and assess forecast accuracy, both the input and the modelling engine are crucial. The purpose of the project is to identify an efficient solution for forecasting demand so as to improve cost management and retail management in the case of the furniture industry. The company uses a forecasting method based on the moving average, on a monthly scale, to estimate the following period. The project goal is to build a model to forecast demand using time series forecasting and a SARIMA model.</p>Om VadnereNeha MahajanAkshada Veer
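The moving-average baseline mentioned above can be sketched as follows; the SARIMA model itself would require a statistics library and is not shown, and the demand figures below are invented for illustration:

```python
def moving_average_forecast(series, window=3):
    """Forecast the next period as the mean of the last `window` observations."""
    recent = series[-window:]
    return sum(recent) / len(recent)

# Hypothetical monthly demand for a furniture item.
monthly_demand = [120, 130, 125, 140, 135, 150]
forecast = moving_average_forecast(monthly_demand)
print(round(forecast, 2))  # mean of the last three months: 141.67
```

SARIMA extends this idea by also modelling trend and seasonal components rather than a flat window average.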
Copyright (c) 2022 International Journal of Algorithms Design and Analysis
2022-07-222022-07-22821621Image Processing and Classification of Underwater Reef Images
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=804
<p>The oceans play a crucial part in the cycle of life on Earth, containing undiscovered objects and materials as well as abundant energy supplies. Many scientists and researchers are studying such objects, and for this research they require clear, high-quality underwater images. While these underwater images play an important role in ocean exploration, the absorption and scattering of light in aquatic media often result in significantly reduced quality. New approaches for enhancing the quality of underwater photos continue to be developed, even though there have lately been significant advancements in the field of image processing. Here, we describe how to enhance and restore images to deal with typical underwater image degradation, such as extreme degradation and distortion. We try to reduce noise, to sharpen or brighten images, and to restore clean, original-looking underwater images using various image processing techniques, and we use a convolutional neural network for classifying some of the major types of coral reef, such as Pocillopora, Turf, Sand, Pocill, Macro and Porit, i.e., underwater image classification.</p>Sujata BhairnallykarDeeshant Dinesh SinghAkhil Shailendra SharmaNitesh Mayaram Yadav
Copyright (c) 2022 International Journal of Algorithms Design and Analysis
2022-07-222022-07-22822230Genetic Algorithm Based Information Retrieval in Fault Diagnosis Systems
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=765
<p>Plant supervisory management systems need reliable handling of multiple independent faults, which is crucial for supporting plant operators’ decision-making. In this regard, the multi-label approach with Support Vector Machines (ML&SVM) used as the base learning algorithm was recently proposed for addressing the problem of identifying coinciding faults in a chemical process using training sets comprised solely of single-fault data. A new method for improving fault diagnosis performance has been proposed, which includes feature extension using the variance and linear trend of the data sets, feature reduction using Genetic Algorithms (GA), and then applying the selected features to the new training set. The identification performance was tested on the Tennessee Eastman benchmark and was measured and compared using the F1 index. Excellent results were obtained for 17 of the 20 faults in the case study, while three faults (3, 9, and 15) could not be classified properly. Finally, the approach's capabilities and limitations are discussed.</p>E. N. Ganesh
Copyright (c) 2022 International Journal of Algorithms Design and Analysis
2022-03-222022-03-22823136Analysis and Design of Efficient HDFS Data Encryption Algorithm: A Review
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=764
<p>Large amounts of structured and unstructured data can be handled using big data. Companies utilize big data for a variety of purposes, including improving customer service and taking other profitable actions. Big data gives businesses benefits such as better customer insight, improved performance, more insightful market intelligence, agile supply chain management, data-driven innovation, smart recommendations, and targeting. Hadoop is a well-built technology mainly used for the distributed processing and storage of large amounts of data. Hadoop itself does not provide any security, so encryption zones are used for security purposes, but from the attacker’s point of view this strategy has weaknesses. The primary objective of this research is to look into the different encryption techniques utilized in the Hadoop Distributed File System. In this paper, we discuss ‘data at rest’ HDFS data encryption algorithms and choose the algorithm that gives the best performance at the storage level when working with big data on the cloud. We first present the introduction to explain the paper’s background, then a review of the literature to discuss the theory behind the research, the research methodology, a comparative analysis, and finally the conclusion and future work.</p>Shivani AwasthiNarendra Kohli
Copyright (c) 2022 International Journal of Algorithms Design and Analysis
2022-03-222022-03-22821830Predictive Data Mining for Cardiopathy
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=762
<p>Large amounts of statistics and data are collected by the healthcare industry on a daily basis. Unfortunately, the important and crucial information remains hidden and must be "mined" from this data to reach important and efficient decisions; complex hidden patterns and relationships generally remain unexplored. Cutting-edge data mining approaches can be used to resolve this situation. This study proposes a prototype model called the Cardiopathy Prediction System (CPS) using data mining methods, namely Decision Trees, Naive Bayes, and Neural Networks. Each of the three models is effective in achieving its objectives in its own way. CPS can answer complex "what if" questions that classic decision support systems are not capable of. Using medical statistics that include attributes such as sex, blood pressure, blood sugar, and age, it predicts a patient's risk of heart disease. This allows important knowledge, such as patterns and correlations among the medical factors associated with cardiopathy, to be formed and used. CPS is user-friendly, scalable, dependable, and expandable.</p>Siddhant SanadhayaSandeep Tuli
Copyright (c) 2022 International Journal of Algorithms Design and Analysis
2022-03-222022-03-22821317An Improved Genetic Algorithm with a Min-Min Approach for Effective Task Scheduling in Cloud Computing
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=761
<p>Cloud computing has emerged as a powerful computing model that offers services on demand as metered services. Making efficient use of resources through the reduction of execution time and cost, and consequently maximizing profit, is the primary goal of a cloud service provider. Hence, devising efficient scheduling algorithms remains an essential challenge in cloud computing. Job scheduling, load balancing across Virtual Machines (VMs), and reducing the makespan involved in finishing the tasks are the essential research concerns. The Improved Genetic Algorithm with Min-Min (IGAMM) technique is proposed in this paper for efficient cloud task scheduling. In the newly introduced method, the workload placed on the machines is distributed and reduced according to their capacity. The main objective of this approach is to reduce the makespan, increase resource utilization, and reduce the amount of power consumed. The proposed method for workload planning is based on clustering of virtual machines in a cloud environment. The goal of the novel approach is to improve the overall performance of cloud computing by minimizing makespan and response times and maximizing virtual machine utilization. The proposed algorithm is evaluated using various aggregate performance metrics. The results of the evaluation show that the proposed IGAMM algorithm outperforms current techniques.</p>B. M. RajeshAntony Selvadoss ThanamaniB. ChithraA. Finny BelwinA. Linda Sherin
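For context, the classic Min-Min heuristic that IGAMM builds on can be sketched as follows; this is the standard heuristic, not the authors' genetic-algorithm hybrid, and the execution-time matrix below is invented for illustration:

```python
def min_min_schedule(exec_time):
    """Min-Min heuristic: exec_time[t][m] is task t's run time on machine m.

    Repeatedly picks the task whose best-case completion time is smallest,
    assigns it to that machine, and returns (assignment, makespan).
    """
    n_tasks, n_machines = len(exec_time), len(exec_time[0])
    ready = [0.0] * n_machines          # current load of each machine
    unassigned = set(range(n_tasks))
    assignment = {}
    while unassigned:
        # Smallest earliest-completion-time over all (task, machine) pairs.
        finish, t, m = min((ready[m] + exec_time[t][m], t, m)
                           for t in unassigned for m in range(n_machines))
        assignment[t] = m
        ready[m] = finish
        unassigned.remove(t)
    return assignment, max(ready)

# 3 tasks on 2 machines: rows = tasks, columns = machines.
times = [[4, 6], [3, 5], [8, 2]]
assignment, makespan = min_min_schedule(times)
print(assignment, makespan)  # task 2 -> machine 1; tasks 0, 1 -> machine 0
```

IGAMM, as described above, would then refine such an initial schedule with genetic operators.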
Copyright (c) 2022 International Journal of Algorithms Design and Analysis
2022-03-222022-03-2282612Analysis and Review of Sorting Algorithms
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=740
<p>Sorting a list of items is an essential problem in software engineering. Although a huge number of sorting algorithms exist, the sorting problem has attracted a great deal of research, because efficient sorting is fundamental to optimizing the use of other algorithms. Sorting algorithms have been studied extensively over recent decades. Their uses are found in many applications, including real-time systems, operating systems, and non-real-time applications. In many cases, the efficiency of an application itself depends on the choice of sorting algorithm. Recently, the use of graphics cards for general-purpose computing has brought renewed attention to sorting algorithms. In this work we extend our previous work on parallel sorting algorithms on GPUs and present an evaluation of parallel and sequential bitonic, odd-even, and rank-sort algorithms on various kinds of GPU and CPU architectures. Their performance for different input lengths is evaluated, and the behaviour of the sorting algorithms is demonstrated on different GPUs and CPUs.</p>Abhishek .Vikram Khandelwal
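One of the algorithms compared above, odd-even (transposition) sort, can be sketched sequentially as follows; the parallel GPU versions evaluated in the paper are not shown, but the comment notes where the parallelism comes from:

```python
def odd_even_sort(a):
    """Odd-even transposition sort.

    Alternates passes over (even, even+1) and (odd, odd+1) index pairs; on a
    parallel machine the swaps within one pass are independent, so they can
    all run at once, which is why this algorithm suits GPUs.
    """
    a = a[:]
    n = len(a)
    swapped = True
    while swapped:
        swapped = False
        for start in (0, 1):              # even pass, then odd pass
            for i in range(start, n - 1, 2):
                if a[i] > a[i + 1]:
                    a[i], a[i + 1] = a[i + 1], a[i]
                    swapped = True
    return a

print(odd_even_sort([5, 1, 4, 2, 3]))  # [1, 2, 3, 4, 5]
```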
Copyright (c) 2022 International Journal of Algorithms Design and Analysis
2022-02-082022-02-088215Virtual Assistant for Desktop
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=713
<p>Nowadays we are stepping towards a digital world. We are surrounded by technology, and it has become such a habit that we cannot live without it. However, some people, such as those with physical disabilities, remain unfamiliar with these technologies, which leaves them behind. For people who are physically challenged, and equally for ordinary users who want to work faster, the software "Virtual Assistant for Desktop" has been developed. It aims to help people in general and to introduce these technologies to physically challenged as well as illiterate users. The solution is fully software based.</p>Aishwarya GaikwadVarad DeshpandeSushant ParabAjay BharsakalePooja Ghodke
Copyright (c) 2021 International Journal of Algorithms Design and Analysis
2021-07-152021-07-15822327Remote Sense to Summarize Sensor Signals
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=712
<p>This study examines reflective distortion and scheme-induced, result-oriented distortion of data retrieved from remote sensing signals. A graphically annotated scheme supports the interpretation of remotely sensed images, as decisively as usual applicable sophistication allows, through algorithm simplification. Surveys from object capture by multiple sensors have shown distorted 3D images after fusion. The data in this study support real, hidden, continuous algorithms based on differential solar irradiance, low extensive thermal emission, day/night emission, and the effects of atmosphere and temperature. Measured parametric simulation has been linked to Landsat image formation from the recognized area, drawing on the empty-set mass, Jousselme’s distance, and the cosine measure.</p>B. Goswami
Copyright (c) 2021 International Journal of Algorithms Design and Analysis
2021-07-142021-07-14821922Automatic Hate Speech Notifier Using Machine Learning Techniques
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=718
The Internet, which was once considered to be a part of only high-class society, is now within the reach of every section of society. As a result, the number of online platform users has increased drastically; according to one survey, the number of social media users has reached 1 billion. Social media platforms provide an online space to share our thoughts, ideas, and perspectives with a larger section of society in an instant, with the touch of a finger. But along with these benefits come certain problems, and one problem that has steadily increased in the recent decade is online hate speech. It is one of the major problems faced by countless people throughout the world, which is why research papers are published every year to spread awareness about online hate speech and the countermeasures we can take to put an end to it. In our research paper we propose a model of a social networking site that is capable of predicting offensive or hateful content whenever a user creates or updates a post. For creating the machine learning model we use different machine learning algorithms, such as the passive-aggressive classifier, SVM, and Naive Bayes, and use the most accurate one for our model.Dev SharmaAnkur Singh RawatAman SinghalAnkit VermaAshish Kumar
Copyright (c) 2021 International Journal of Algorithms Design and Analysis
2021-07-142021-07-14823240Survey Paper On Divide & Conquer Algorithm
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=719
<p><em>To manage data in a database system, operations such as insertion, deletion, sorting, searching, and updating have to be performed. A sorting algorithm helps put data in order and helps search data faster; a better sorting algorithm therefore increases the performance of the database management system. The divide and conquer approach helps the database management system increase sorting performance in aspects such as space and time. Divide and conquer algorithms follow iterative as well as recursive methods for sorting. This paper aims to discuss different sorting algorithms and the comparisons between them, analysing each algorithm with respect to time and space.</em></p>Anjali D. JaiswalKunal R. More
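A canonical divide-and-conquer sorting algorithm of the kind this survey discusses is merge sort: split the list, sort each half recursively, then merge. A minimal sketch:

```python
def merge_sort(a):
    """Classic divide-and-conquer sort: split, sort halves, merge (O(n log n))."""
    if len(a) <= 1:
        return a[:]
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # Merge the two sorted halves back together.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]
```

The merge step uses O(n) extra space, which is the space/time trade-off the paper's comparison refers to.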
Copyright (c) 2021 International Journal of Algorithms Design and Analysis
2021-07-142021-07-14822831A Modified Cuckoo Search for High-Dimensional Data using Fast Adaptive K-Means Algorithm
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=711
Knowledge in the contemporary environment is increasingly defined by high-dimensional structures arising in various applications. In this paper, a prototype subspace clustering concept based on Fast Adaptive K-Means (FAKM) with an adaptive loss function is developed to deliver a flexible cluster indicator. FAKM was recently developed to unify the clustering and feature selection process; the FAKM algorithm, however, ignores the degradation of its objective value, which measures its added value in real-world applications. The main goal of this article is to propose a novel clustering method for the optimal result under the Frobenius norm. The proposed model is solved using the Modified Cuckoo Search (MCS) optimization method. Because the MCS has certain unique characteristics, such as simple operation, consistent convergence, and computational efficiency, it may be used to aggregate cluster points in the most efficient way possible. The new method was finally put to the test utilizing a variety of benchmark datasets from the University of California Irvine (UCI) repository, as well as sophisticated clustering algorithms. Result measures such as Accuracy (ACC), Normalized Mutual Information (NMI), and Error Rate (ER) are estimated on three datasets: Breast, Glass, and WebKB.D. KarthikaK. Kalaiselvi
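For context, the plain k-means (Lloyd's) baseline that FAKM extends can be sketched on 1-D data as follows; this is the textbook algorithm, not FAKM or MCS, and the data points are invented:

```python
import random

def k_means(points, k, iters=20, seed=1):
    """Plain Lloyd's k-means on 1-D points: assign, then update centroids."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: abs(p - centroids[c]))
            clusters[nearest].append(p)
        # Update step: each centroid moves to its cluster's mean.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

# Two well-separated 1-D blobs around 1 and 100.
data = [0.0, 1.0, 2.0, 99.0, 100.0, 101.0]
print(k_means(data, 2))  # [1.0, 100.0]
```

FAKM replaces the fixed squared-error loss in the update step with an adaptive loss, and MCS replaces this alternating optimization with a population-based search.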
Copyright (c) 2021 International Journal of Algorithms Design and Analysis
2021-07-132021-07-1382111Face Recognition of Criminals and Missing Individuals
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=710
Face recognition is one of the most trending areas of research. One of its applications is the face recognition of criminals and missing individuals. A criminal record usually contains personal information about the accused along with a photograph. In practice, identification of criminals is done through biometric identification; nevertheless, face recognition can be more efficient in tracking down individuals. Video recordings from CCTVs installed at various public places are used for processing; if the quality of a recording is poor, it is enhanced. This paper proposes an Android application that detects criminal or missing individuals using the Haar cascade algorithm; the features are extracted using a Siamese network algorithm and stored on Firebase. Further, using these stored features, face recognition is executed. The application is built using React Native as the front-end and the Django framework as the back-end.Anjali Dilip JaiswalAnanya V. KamalapurKunal R. MoreSujata Bhairnallykar
Copyright (c) 2021 International Journal of Algorithms Design and Analysis
2021-07-13 | 2021-07-13 | 8(2): 12–18
Missing Data Imputation in Large Data Set Using Chernaïve Classifier
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=641
<p><strong>ABSTRACT</strong><br />Data mining is a knowledge domain for information industries and communal sites, abstracting and refining information from massive architectures. The objectives of the study highlight the transformation of the limitations of multiple imputation in large data sets through the Adaptive Boosting algorithm, Naïve Bayesian (NB), the J48 algorithm and CHAID, and the construction of the Chernaïve Classifier to enable missing-data prediction on large data sets. This research aims to prove that the Chernaïve Classifier overcomes the limitations of Decision Tree (DT), Adaptive Boosting (ADAB), Naïve Bayesian and J48. A mathematical model is constructed implementing the features of the Chernaïve Classifier. This model overcomes the issues of independence classifiers and boosting techniques to implement the prediction of missing data in historical data items. To implement every stage of the research work, standard tools such as MATLAB and SPSS were used for evaluation.</p><p><br /><strong>Keywords:</strong> Chernoff Bounds, Naive Bayes classifier, Decision Tree, Adaptive Boosting, J48, Missing Data Imputation, Classifier, Chernaïve Classifier</p>Dr. A. Finny Belwin
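A simple baseline for the missing-data imputation task described above can be sketched as column-wise mean imputation; this is an illustrative baseline, not the proposed Chernaïve Classifier:

```python
def impute_mean(rows):
    """Column-wise mean imputation: replace None with the column mean.
    A generic baseline sketch, not the paper's method."""
    cols = list(zip(*rows))
    means = [sum(v for v in col if v is not None) /
             max(1, sum(v is not None for v in col)) for col in cols]
    return [[m if v is None else v for v, m in zip(row, means)] for row in rows]

data = [[1.0, None], [3.0, 4.0], [None, 8.0]]
print(impute_mean(data))  # [[1.0, 6.0], [3.0, 4.0], [2.0, 8.0]]
```

Classifier-based imputation, as studied in the paper, replaces the column mean with a value predicted from the other attributes of the same record.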
Copyright (c) 2021 International Journal of Algorithms Design and Analysis
2021-02-27 | 2021-02-27 | 8(2): 53–74
Detailed Analysis and Simulation of Various Process Scheduling Algorithms
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=662
<div><p align="center"><strong><em>ABSTRACT</em></strong></p><p><em>CPU scheduling is a dominant concept in multiprocessing, multitasking, time-sharing, and real-time operating system design. The CPU scheduler deals with the problem of deciding which of the processes in the ready queue is to be assigned to the CPU. The operating system makes the computer more productive by switching the CPU among processes. In this paper, we examine the parameters that determine process scheduling performance, such as waiting time, turnaround time, fairness, priority, response time and throughput, for algorithms including first come first serve (FCFS), shortest job first (SJF) non-preemptive, shortest job first (SJF) preemptive, round robin (RR) and priority scheduling.</em></p><p><strong><em> </em></strong></p><p><strong><em>Keywords: </em></strong><em>CPU scheduling, scheduling algorithms, multitasking, multiprocessing</em></p></div>Shubh Mehta, Harshad Mehta
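The waiting-time and turnaround-time parameters discussed above can be computed for FCFS, the simplest of the listed algorithms, as in the following sketch (assuming all processes arrive at t = 0):

```python
def fcfs_metrics(burst_times):
    """First-come-first-serve: waiting and turnaround times for
    processes that all arrive at t = 0 (illustrative sketch)."""
    waiting, turnaround, clock = [], [], 0
    for burst in burst_times:
        waiting.append(clock)        # time spent in the ready queue
        clock += burst
        turnaround.append(clock)     # completion time = waiting + burst
    return waiting, turnaround

w, t = fcfs_metrics([5, 3, 8])
print(w, t)              # [0, 5, 8] [5, 8, 16]
print(sum(w) / len(w))   # average waiting time
```

SJF sorts the burst list before running the same loop, which is why it minimizes average waiting time for non-preemptive scheduling.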
Copyright (c) 2021 International Journal of Algorithms Design and Analysis
2021-02-27 | 2021-02-27 | 8(2): 43–52
Enhanced Artificial Neural Network with Particle Swarm Optimization Algorithm for Detecting Distributed Denial of Service in Cloud
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=661
<p><strong>ABSTRACT</strong></p><p><br />The distributed computing model represents another shift in outlook for electronic services, delivering highly scalable distributed computing networks in which computational resources are offered ‘as a service’. Although the cloud model is intended to bring substantial benefits to every cloud stakeholder, including cloud providers (CPs), cloud customers (CCs) and service providers (SPs), the model still has various open issues, such as security, that affect its credibility. In this work, an Enhanced Artificial Neural Network with Particle Swarm Optimization (EANNPSO) is proposed in order to improve cloud performance. Information security management systems (ISMS) are defined frameworks that give a model for establishing, implementing, operating, monitoring, reviewing, maintaining and improving the protection of data resources. An attacker can make use of a cloud to host a malicious application to achieve his aim, possibly a Distributed Denial of Service (DDoS) attack against the cloud itself or against another client in the cloud. In this work, the EANNPSO algorithm is proposed to identify DDoS attacks efficiently using optimal objective values and hidden neuron values. The proposed EANNPSO system provides higher security performance than the existing methods.</p><p><br /><strong>Keywords:</strong> Cloud computing, Enhanced Artificial Neural Network with Particle Swarm Optimization (EANNPSO), Distributed Denial of Service (DDoS)</p>B. M. Rajesh, B. Chithra, A. Finny Belwin, A. Linda Sherin, Antony Selvadoss Thanamani
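The PSO half of EANNPSO can be sketched in bare-bones form; the coupling to the neural network and the DDoS traffic features are not reproduced here, and all coefficients are conventional textbook values:

```python
import random

def pso_minimize(f, dim=2, particles=20, iters=100, seed=0):
    """Bare-bones particle swarm optimization (illustrative sketch only)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]                         # inertia
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])  # cognitive
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))    # social
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

best = pso_minimize(lambda x: sum(v * v for v in x))  # sphere test function
print(sum(v * v for v in best) < 0.1)                 # converged near the origin
```

In the paper's setting, the objective f would score a candidate set of network weights/hidden-neuron values on labelled traffic rather than a test function.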
Copyright (c) 2021 International Journal of Algorithms Design and Analysis
2021-02-27 | 2021-02-27 | 8(2): 34–42
Missing Data Imputation in Large Data Set Using Chernaïve Classifier Influenced by Bolzano–Weierstrass Theorem
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=663
<p><strong>ABSTRACT</strong><br />Data mining is part of a superordinate process called knowledge discovery in databases (KDD), where the term “database” refers to any kind of data storage and does not solely comprise data stored in database management systems. Data mining is integrated as a single step of the KDD process, usually between model selection and interpretation. KDD is the non-trivial process of identifying valid, novel, potentially useful, and ultimately understandable patterns in data. In other words, KDD addresses the problem of mapping low-level data into other forms that might be more compact, more abstract or more useful. The key conjecture of the study highlights the transformation of the limitations of multiple imputation in large data sets through the Adaptive Boosting algorithm, Naïve Bayesian (NB), the J48 algorithm and CHAID, and the construction of the Chernaïve Classifier to enable missing-data prediction on large data sets. This research aims to prove that the Chernaïve Classifier overcomes the limitations of Decision Tree (DT), Adaptive Boosting (ADAB), Naïve Bayesian and J48. The influence of the Bolzano–Weierstrass theorem on missing data imputation is the key objective of this article. A mathematical model is constructed implementing the features of the Chernaïve Classifier and applied to the SONAR data set. This model overcomes the issues of independence classifiers and boosting techniques to implement the prediction of missing data in historical data items. To implement every stage of the research work, standard tools such as MATLAB and SPSS were used for evaluation.</p><p><br /><strong>Keywords:</strong> Chernoff Bounds, Naive Bayes classifier, Decision Tree, Adaptive Boosting, J48, Missing Data Imputation, Classifier, Chernaïve Classifier, Bolzano–Weierstrass theorem</p>Dr. A. Finny Belwin
Copyright (c) 2021 International Journal of Algorithms Design and Analysis
2021-02-27 | 2021-02-27 | 8(2): 18–33
Design and Implementation of 64-Bit Arithmetic Logic Unit on FPGA Using VHDL
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=660
<div><p align="center"><strong><em>ABSTRACT</em></strong></p><p><em>A 64-bit ALU is designed and implemented using VHDL and simulated on a Xilinx simulator. The ALU is the basic building block of a processor; it performs logical, arithmetic, and shifting operations on data in the processor. The proposed design may find application where automotive and control functions are required. The Arithmetic Logic Unit is the part of the Central Processing Unit which performs arithmetic operations such as addition, subtraction, division and multiplication, logical operations such as OR and XOR, and shift-rotate operations. Reusable systems provide a solution to complex problems by combining the flexibility of software with hardware speed, improving system performance. Over the last three decades, such technology has fundamentally changed computing. Performance can be greatly improved without losing accuracy by adapting computational precision in many applications. To enable this, we propose a new arithmetic logic unit (ALU) architecture that dynamically supports precise operation on the fly. As the ALU becomes more complex, it becomes more expensive and takes up more space in the CPU, so power consumption is a major problem. The synthesized VHDL RTL code forms a fixed-point arithmetic core. The functions of fixed-point arithmetic were first verified by simulations with single-instruction tests, and the fixed-point arithmetic was then implemented on an FPGA. The demand for complex tasks is increasing day by day, driving processor efficiency and resulting in more components manufactured on a single chip, in line with Moore’s law. The Arithmetic Logic Unit is the core structure of the central processing unit, and modern CPUs contain very powerful and complex ALUs. In addition to the ALU, today’s CPUs have a control unit that operates the ALU through control signals. These signals tell the ALU which operations to perform, and the ALU stores the results of these operations in output registers. The CU also moves data between these registers, the ALU, and memory through control signals. VHDL is one of the most popular industry languages for the modeling, description, and synthesis of digital circuits and systems. It is a high-level language suitable for the design of complex systems, and it allows users to create complex data types. Design units, also called library units, are the main components of the VHDL language.</em></p><p><strong><em> </em></strong></p><p class="Default"><strong><em>Keywords:</em></strong><em> ALU, VHDL, control unit</em></p></div>Sateesh Kourav, Sunil Shah
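The ALU behaviour described above can be modelled in software; the sketch below is a 64-bit behavioural model with assumed operation names, not the VHDL design itself:

```python
MASK = (1 << 64) - 1  # keep every result inside 64 bits

def alu(op, a, b=0):
    """Behavioural sketch of 64-bit ALU operations: arithmetic wraps in
    two's complement, logic is bitwise, shifts move by one position."""
    ops = {
        'ADD': (a + b) & MASK,
        'SUB': (a - b) & MASK,   # two's-complement wrap-around
        'AND': a & b,
        'OR':  a | b,
        'XOR': a ^ b,
        'SHL': (a << 1) & MASK,
        'SHR': a >> 1,
    }
    return ops[op]

print(hex(alu('ADD', (1 << 64) - 1, 1)))  # 0x0 : addition wraps at 64 bits
print(hex(alu('XOR', 0xF0F0, 0x0F0F)))    # 0xffff
```

In the VHDL design, `op` corresponds to the control-unit signals and the returned value to the output register.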
Copyright (c) 2021 International Journal of Algorithms Design and Analysis
2021-02-27 | 2021-02-27 | 8(2): 9–17
A Review on Diabetes Prediction Using Machine Learning Algorithms
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=655
<p><strong>ABSTRACT</strong><br />Data mining (DM) is a way of dealing with substantial information sets to recognize patterns and establish associations that resolve issues through data investigation. Diabetes is a chronic disease, or group of metabolic disorders, in which a person suffers from a prolonged elevated blood glucose level because the body either does not produce sufficient insulin or its cells do not respond adequately to insulin. The chronic hyperglycemia of diabetes is linked to long-term damage, dysfunction and failure of different organs, especially the eyes, kidneys, nerves and heart. Diagnosing diabetes at an early stage is difficult in real-world medical practice. This paper has examined different data mining parameters for diabetes diagnosis. We have also studied data mining classification algorithms, which play a vital role in the DM process. Further study can be extended to include other machine-learning algorithms for the automation of diabetes analysis.</p><p><strong>Keywords:</strong> Diabetes, diabetes prediction, machine learning, data mining</p><p> </p>Esther Bezzem, Harendra Singh, Megha Kamble
Copyright (c) 2021 International Journal of Algorithms Design and Analysis
2021-02-23 | 2021-02-23 | 8(2): 1–8
Online Training and Management Support Center
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=565
<p>Computerized Personal Briefcase is developed to let users save and retrieve their personal data whenever, and from wherever, possible. This helps users access their personal data quickly and easily as and when required, thereby improving their operational efficiency and effectiveness. In today’s competitive environment, where everybody wants to be on top, information plays a very crucial role: the faster information is accessed and processed, the better the results. Today, the Internet is the fastest way of transferring data and information over a wide area, hence the Internet has been used as the medium for exchanging information. A computerized system helps to fulfil these goals. Computerization will help in doing a lot of manual work quickly, and in the easy storage and access of all information in a short period of time.</p><p>Keywords: Computer, data, internet, online training, management system</p><p>Cite this Article: Neha Verma, Nidhi Verma. Online training and management support center. International Journal of Algorithms Design and Analysis. 2020; 6(1): 32–53p.</p>Neha Verma, Nidhi Verma
Copyright (c) 2020 International Journal of Algorithms Design and Analysis
2020-06-27 | 2020-06-27 | 8(2): 32–53
Monitoring and Analysis of Water Quality using IoT and SVM
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=568
<p>Nowadays, the Internet of Things (IoT) is used for monitoring, collecting and analysing data from remote locations. Water pollution is one of the biggest problems in the world due to industrialization. Water is an essential need, and therefore there must be mechanisms to test its quality. This paper focuses on providing an IoT platform for analysing data collected from sensors; the sensor data can be viewed on the Internet. The water quality index is calculated using the weighted arithmetic index method. The parameters considered are pH, temperature, turbidity, dissolved oxygen and total suspended solids. The main focus is to analyse the data and classify the quality of water as excellent, good, poor, very poor or unsuitable for drinking with the help of a support vector machine algorithm. Results are displayed in a user interface developed using Flask.</p><p>Keywords: Algorithm, IoT, sensors, support vector machine, water quality</p><p>Cite this Article: K. Purushotam Naidu, K. Anusha, A. Neelima, K. Jahnavi, K. Sai Harshitha. Monitoring and analysis of water quality using IoT and SVM. International Journal of Algorithms Design and Analysis. 2020; 6(1): 24–31p.</p>K. Purushotam Naidu, K. Anusha, A. Neelima, K. Jahnavi, K. Sai Harshitha
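The weighted arithmetic index method mentioned above can be sketched as follows; the parameter standards and sensor readings are made-up examples, not the paper's data:

```python
def wqi(measured, standard, ideal):
    """Weighted arithmetic water quality index (illustrative sketch).
    Unit weights are taken inversely proportional to each standard value."""
    k = 1.0 / sum(1.0 / s for s in standard)   # proportionality constant
    weights = [k / s for s in standard]        # unit weight w_i = k / S_i
    quality = [100.0 * (v - v0) / (s - v0)     # quality rating q_i
               for v, s, v0 in zip(measured, standard, ideal)]
    return sum(w * q for w, q in zip(weights, quality)) / sum(weights)

# pH, turbidity (NTU), dissolved oxygen deficit -- hypothetical readings
print(round(wqi(measured=[8.0, 4.0, 6.0],
                standard=[8.5, 5.0, 10.0],
                ideal=[7.0, 0.0, 0.0]), 2))  # 71.46
```

The resulting index is then binned into the classes named in the abstract (excellent, good, poor, very poor, unsuitable), which is the label the SVM learns to predict from raw readings.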
Copyright (c) 2020 International Journal of Algorithms Design and Analysis
2020-06-27 | 2020-06-27 | 8(2): 24–31
Online Weather Forecaster
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=566
<p>Weather forecasting is one of the most challenging problems in the world because it uses multidimensional and nonlinear data from various fields. This paper describes data mining algorithms, namely a regression algorithm and k-nearest neighbour, which are used for prediction. The Frequent Pattern Growth algorithm is applied to the collected datasets to delete inappropriate data. Temperature, humidity and wind speed are mainly responsible for the weather prediction; based on these parameters, temperature, humidity and rainfall are predicted. Weather forecasting is an application of science and technology to predict the atmosphere of a particular location for a specific range of time, and it was one of the most scientifically and technologically challenging problems around the world in the last century. Since ancient times, weather prediction has been one of the most interesting and fascinating domains. Once an all-human endeavour based mainly upon changes in barometric pressure, humidity, temperature and sky condition, weather forecasting now relies on computer-based models that take many atmospheric factors into account. Human input is still required to pick the best possible forecast model to base the forecast upon, which involves pattern recognition skills and knowledge of model performance. The chaotic nature of the atmosphere, the massive computational power required to solve the equations that describe it, the error involved in measuring the initial conditions, and an incomplete understanding of atmospheric processes mean that forecasts become less accurate as the gap between the current time and the time of the forecast increases. The use of data mining techniques in forecasting maximum and minimum temperature, rainfall, humidity and wind speed was investigated. This was carried out using a regression technique and meteorological data collected from previous years. The performances of these algorithms were compared with weather data for the predicted periods, and the algorithm which gave the best results was used to generate classification rules for the mean weather variables. The parameters responsible for weather prediction are temperature, pressure and humidity.</p><p>Keywords: Algorithm, data mining, k-nearest, technology, weather forecasting</p><p>Cite this Article: Nidhi Verma, Niru Pandey, Krishna Giri. Online weather forecaster. International Journal of Algorithms Design and Analysis. 2020; 6(1): 13–23p.</p>Nidhi Verma, Niru Pandey, Krishna Giri
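The k-nearest-neighbour prediction step described above can be sketched as follows, with made-up toy readings standing in for the meteorological data:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """k-nearest-neighbour classification: majority vote among the k
    historical records closest to the query (illustrative sketch)."""
    dist = lambda a, b: math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    nearest = sorted(train, key=lambda row: dist(row[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# (temperature degC, humidity %) -> observed weather, hypothetical history
history = [((30, 40), 'sunny'), ((31, 45), 'sunny'),
           ((22, 90), 'rain'),  ((21, 85), 'rain'), ((25, 70), 'cloudy')]
print(knn_predict(history, (23, 88)))  # 'rain'
```

With real data the features would normally be normalized first, since humidity (0–100) would otherwise dominate the distance over temperature.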
Copyright (c) 2020 International Journal of Algorithms Design and Analysis
2020-06-27 | 2020-06-27 | 8(2): 13–23
Review on RFID based Wallet System and Design and Implementation of Client-Server System with Peripherals for Student Centric Applications
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=569
<p>RFID is used in mainstream applications that help speed the handling of manufactured goods and materials. Individuals are extremely keen to conclude their business (money) transactions and virtually every task through internet-based services. A system is created to integrate all applications into one by using a universal RF-ID (Radio Frequency Identification) card. The universal system is designed to take full advantage of the facilities provided by the institute and helps students save time during their busy schedules. This is achieved by automating many services provided by the institute, including the trip management of field-to-field bus services, all controlled through an applicable controller such as an Arduino or Raspberry Pi. Web page access is provided to the administrator to manage student wallets, and students are connected to the system through a client-server system which stores the database of student information.</p><p>Keywords: Arduino, embedded, client-server, RF-ID, student database</p><p>Cite this Article: Rushikesh Devre, Unmesh Dhere, Yash Gujarathi, Shubham Ghag, Swati Shinde. Review on RFID based wallet system and design and implementation of client-server system with peripherals for student centric applications. International Journal of Algorithms Design and Analysis. 2020; 6(1): 6–12p.</p>Rushikesh Devre, Unmesh Dhere, Yash Gujarathi, Shubham Ghag, Swati Shinde
Copyright (c) 2020 International Journal of Algorithms Design and Analysis
2020-06-27 | 2020-06-27 | 8(2): 6–12
Design of Thinned Triangular Fractal Antenna Array using Genetic Optimization Algorithm
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=567
<p>This research work mainly focuses on the reduction of side lobe levels and minimizing the number of array elements of a triangular fractal array using an evolutionary optimization technique. A thinned triangular fractal array for four successive iteration levels has been designed with and without the genetic optimization algorithm, and the outputs of the two cases were compared. Owing to the evolutionary technique, nearly 50% thinning is achieved in all four iterations.</p><p>Keywords: Antenna array, evolutionary, fractal, genetic, optimization algorithm</p><p>Cite this Article: Praveena A., V.A. Sankar Ponnapalli. Design of thinned triangular fractal antenna array using genetic optimization algorithm. International Journal of Algorithms Design and Analysis. 2020; 6(1): 1–5p.</p>Praveena A., V.A. Sankar Ponnapalli
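The genetic search used for thinning can be sketched as a minimal binary-genome GA, where each bit switches one array element on or off; the toy fitness below stands in for the actual side-lobe-level objective, which requires an array-factor evaluation not shown here:

```python
import random

def ga(fitness, genome_len=16, pop_size=30, gens=60, seed=1):
    """Minimal genetic algorithm over a binary genome (illustrative sketch):
    truncation selection, one-point crossover, single-bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]            # keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, genome_len)   # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(genome_len)        # single-bit mutation
            child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# toy objective: thin the array (few active elements) while keeping >= 6 on
best = ga(lambda g: -sum(g) if sum(g) >= 6 else -100)
print(sum(best))  # number of active elements (>= 6 by the constraint)
```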
Copyright (c) 2020 International Journal of Algorithms Design and Analysis
2020-06-27 | 2020-06-27 | 8(2): 1–5
Stabilizing 360° Videos
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=522
<p>In this paper, we concentrate on stabilizing 360° videos, which when captured instantly cause unnecessary shakiness, resulting in cybersickness for the spectator. 360° videos are video recordings in which we get the complete view around us at the same time. We implemented our algorithm in MATLAB version R2016b. In our algorithm, we first convert the unstabilized 360° video, which is usually represented in equirectangular format, to a cubemap. We then extract Speeded Up Robust Features (SURF) from each of the faces and estimate a set of outliers and inliers using the RANSAC algorithm. By applying a warping technique to each frame, we can stabilize the 360° video.</p><p>Keywords: Cubemap, feature extraction, SURF features, RANSAC, warping</p><p>Cite this Article: Farzana Nazim, Krishna P., Reshma Chandran C. R., Afsal S. Stabilizing 360° Videos. International Journal of Algorithms Design and Analysis. 2019; 5(2): 52–58p.</p>Farzana Nazim, Krishna P., Reshma Chandran C. R., Afsal S.
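The inlier/outlier estimation step can be illustrated with RANSAC on a toy 2-D line-fitting problem; the paper applies the same idea to SURF feature matches rather than points on a line:

```python
import random

def ransac_line(points, iters=200, tol=0.5, seed=0):
    """RANSAC sketch: repeatedly fit a line to a random pair of points and
    keep the model that agrees with the most points (the inliers)."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue                              # skip degenerate pairs
        m = (y2 - y1) / (x2 - x1)                 # candidate slope
        c = y1 - m * x1                           # candidate intercept
        inliers = [(x, y) for x, y in points if abs(y - (m * x + c)) < tol]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (m, c), inliers
    return best_model, best_inliers

pts = [(x, 2 * x + 1) for x in range(10)] + [(3, 40), (7, -5)]  # line + outliers
model, inliers = ransac_line(pts)
print(len(inliers))  # 10: the collinear points survive, the 2 outliers do not
```

In the stabilization pipeline, the "model" is a frame-to-frame motion estimate and the outliers are bad SURF matches, but the consensus loop is the same.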
Copyright (c) 2020 International Journal of Algorithms Design and Analysis
2020-02-12 | 2020-02-12 | 8(2): 52–58
Neural Modelling Approach to Integrated Process Control
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=512
<p>Control system engineers have contributed to the development of PID (proportional, integral, differential) controllers for use in the process control industry. These developments are based on control theory principles such as pole-zero cancellation, the root-locus method and disturbance (white noise) rejection. There is also a wide literature on artificial neural network (ANN) applications to integrate statistical process control (SPC) and automatic/engineering process control (APC/EPC). These publications are heavily based on higher-order mathematics and may not be within the grasp of most researchers with mathematical knowledge of college or university graduate standard. In order to bridge this wide knowledge gap, this technical note discusses and suggests an ANN approach to the design of a quality regulator that uses an integral regulator algorithm developed from first principles. The mathematical and stochastic modelling and the feedback control algorithm are developed based on the research publications of Box and Jenkins and the technical literature available in Technometrics, IEEE Transactions, the Journals of the Royal Statistical Society and Quality Technology, etc. It is expected that the quality integral regulator developed using neuron models and ANN principles will also optimize the variance of the product quality variable through a predictive and adaptive process control regulator algorithm. The quality integral regulator will help minimize output quality variations due to process operating conditions. The regulator algorithm developed on the basis of ANN theory and principles will answer the challenging demands of modern, remotely computer-controlled complex dynamic processes. The existing PID and industrial controllers can extend their process control capabilities only up to a certain level and extent. The quality integral regulator designed and developed on the basis of ANN models will provide for and meet the needs of the modern process control industry for efficient quality control.</p><p>Keywords: SPC, APC, ANN, PID controller, stochastic, integral, regulator, algorithm</p><p>Cite this Article: G. Venkatesan. Neural Modelling Approach to Integrated Process Control. International Journal of Algorithms Design and Analysis. 2019; 5(2): 46–51p.</p>G. Venkatesan
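The integral regulator action referred to above can be sketched in discrete form, where the control signal accumulates the error between the quality variable and its target; the gain and readings below are illustrative, not values from the note:

```python
def integral_regulator(setpoint, readings, ki=0.5):
    """Discrete integral control action u_t = u_{t-1} + Ki * e_t:
    a sketch of the basic regulator form, with an assumed gain Ki."""
    u, trace = 0.0, []
    for y in readings:
        error = setpoint - y     # deviation of quality variable from target
        u += ki * error          # integral action accumulates the error
        trace.append(u)
    return trace

# quality variable drifting below a target of 10.0
print(integral_regulator(10.0, [9.0, 9.2, 9.5, 9.8]))
```

As long as the output stays below target, the accumulated action keeps rising, which is what removes steady-state offset; the ANN/stochastic elements of the note add prediction and adaptation on top of this basic law.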
Copyright (c) 2020 International Journal of Algorithms Design and Analysis
2020-02-12 | 2020-02-12 | 8(2): 46–51
Simulation of Perturb and Observe Maximum Power Point Tracking Algorithm for Photovoltaic System
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=521
<p>Photovoltaic (PV) power is a renewable energy source which plays a dominant role in electric power generation. It has become essential nowadays on account of the inadequacy and environmental impact of conventional fuels. Solar photovoltaic energy generation is an important substitute for many purposes, with the features of being clean, environment friendly, non-exhaustible, sustainable and available throughout the world. Conversion of renewable energy into electricity improves generation and reduces CO2 emissions. PV power generation depends mainly on temperature and solar irradiation, and since the current-voltage characteristic curve changes dynamically with both, it is imperative to control the photovoltaic generation so as to harvest the maximum solar energy at a distinct operating point. Therefore, an MPPT algorithm is needed for the effective operation of photovoltaic power generation. In this paper, to harvest the maximum available power from the PV panel, the perturb and observe (P&O) maximum power point tracking algorithm is used and implemented using a zeta converter. The reason behind choosing this converter is its essential properties of fast response, a simple circuit and a very large range of power tracking efficiency. The analysis is carried out in MATLAB/Simulink.</p><p>Keywords: photovoltaic (PV), maximum power point tracking (MPPT), perturb and observe (P&O), zeta converter, PV panel</p><p>Cite this Article: Anusha Anil Kumar, Gayathri V. Simulation of Perturb and Observe Maximum Power Point Tracking Algorithm for Photovoltaic System. International Journal of Algorithms Design and Analysis. 2019; 5(2): 37–45p.</p>Anusha Anil Kumar, Gayathri V.
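The P&O tracking loop can be sketched as follows; the P-V curve, step size and starting point are illustrative, and the zeta-converter duty-cycle details are omitted:

```python
def perturb_and_observe(power_at, v_start=10.0, step=0.5, iters=50):
    """Perturb-and-observe MPPT sketch: nudge the operating voltage and
    keep moving in the direction that increases the observed power."""
    v, p_prev, direction = v_start, power_at(v_start), 1.0
    for _ in range(iters):
        v += direction * step        # perturb the operating point
        p = power_at(v)              # observe the resulting power
        if p < p_prev:               # power fell: reverse the perturbation
            direction = -direction
        p_prev = p
    return v

# toy P-V curve with its maximum power point at V = 17
pv_curve = lambda v: 100 - (v - 17.0) ** 2
print(abs(perturb_and_observe(pv_curve) - 17.0) <= 1.0)  # oscillates near the MPP
```

The sketch also shows P&O's characteristic behaviour: it never settles exactly on the maximum but oscillates around it within one perturbation step.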
Copyright (c) 2020 International Journal of Algorithms Design and Analysis
2020-02-12 | 2020-02-12 | 8(2): 37–45
Application-Quality Integral Regulator
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=511
<p>Process control practitioners have made many contributions over the past half century to integrate statistical process control (SPC) and automatic/engineering process control (APC/EPC). The quality regulator uses an integral regulator algorithm developed from the research publications of Astrom, Box and Jenkins and the technical literature available in Technometrics, IEEE Transactions, the Journals of the Royal Statistical Society and Quality Technology, etc. The quality integral regulator optimizes the variance of the product quality variable through a predictive and adaptive process control regulator algorithm, and minimizes output quality variations due to process operating conditions. The regulator algorithm answers the challenging demands of modern, remotely computer-controlled complex dynamic processes. The existing controllers can extend their process control capabilities only up to a certain level and extent. The use of the quality integral regulator provides practical solutions to such demands and meets the needs of the modern process control industry for efficient quality control without depending on the existing SPC-APC process control method.</p><p>Keywords: automatic/engineering process control (APC/EPC), statistical process control (SPC), quality integral regulator, autoregressive integrated moving average, transfer function noise</p><p>Cite this Article: G. Venkatesan. Application—Quality Integral Regulator. International Journal of Algorithms Design and Analysis. 2019; 5(2): 26–36p.</p>G. Venkatesan
Copyright (c) 2020 International Journal of Algorithms Design and Analysis
2020-02-12 | 2020-02-12 | 8(2): 26–36
Algorithms Design and Analysis in Medicine and Fluids Mechanics
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=508
<p>Computer mathematical modeling is the most powerful tool of knowledge, analysis and design for scientific researchers who develop complex medical and hydromechanical technological processes. Computer modeling gives the researcher the possibility to simulate objects in cases when working with the real object is practically impossible or economically inexpedient. In many cases, computer mathematical modeling permits learning something new and previously unknown. The paper presents the author's computer mathematical modeling of the time of appearance of metastases and the estimation of treatment results among oncological patients, together with the computer mathematical modeling of the behavior of a gas bubble, fixed on the surface of a solid material, under a change in the acceleration of gravity. The results of the models' work were experimentally confirmed.</p><p>Keywords: computer mathematical modeling, algorithms design and analysis, computer mathematical modeling of the time of appearance metastases, computer mathematical modeling of the behavior of the gas bubble</p><p>Cite this Article: Michael Shoikhedbrod. Algorithms Design and Analysis in Medicine and Fluids Mechanics. International Journal of Algorithms Design and Analysis. 2019; 5(2): 1–25p.</p>Michael Shoikhedbrod
Copyright (c) 2020
2020-02-12 | 2020-02-12 | 8(2): 1–25
Smart Machine Power Consumption and Monitoring System in Industry
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=505
Energy monitoring has an important role in identifying energy conservation opportunities in different industrial areas. The proposed system monitors the power consumed by different sections of the plant. It makes it possible to understand sudden changes in energy consumption in terms of aspects like power, temperature and load during a given process. The aim is live monitoring using an Arduino to gain information about industrial drives. This information gives us the power requirement of the machine with respect to distinct specifications. The benefit of the thermostat is analyzing the various temperatures according to the load. This system plays a vital role in observing and managing the energy concerns of each section of the plant. Keywords: Arduino, LED display, motor drive, relay, thermostat Cite this Article: Preetha S., Kowsalya G., Malarvizhi M., Manjuthaa N., Manobharathi S. Smart Machine Power Consumption and Monitoring System in Industry. International Journal of Algorithms Design and Analysis. 2019; 5(1): 26–30p. Preetha S., Kowsalya G., Malarvizhi M., Manjuthaa N., Manobharathi S.
Copyright (c)
2019-08-07 | 2019-08-07 | 8(2): 26–30
Design and Implementation of New Control System to Enhance the Usage of Drip Irrigation Water Using Lab View
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=504
Water scarcity is one of the major issues that makes farmers spend more on efficient irrigation of their farmlands, and the government pays compensation costs in millions to farmers during drought seasons. Utmost care should be taken for the effective utilization of the available water sources in the present situation. The most effective way of irrigation introduced in the last decade is the drip irrigation system (DIS). The wastage of water used to irrigate crops and plants in agricultural fields is reduced by 60%–70% at the maximum, but approximately 30%–40% of the water is still not properly utilized. Therefore, an idea is needed to maximize utilization and reduce the wastage of water in drip irrigation. The actual water flow through drip irrigation is 4 gallons per hour (gph). The proposed system consists of earth electrodes, an NI-DAQ 6009, a relay module and a solenoid valve, and aims to reduce the water wastage of approximately 2 gph while irrigating plants in the existing drip irrigation system. The optimization made in this proposed idea will help farmers in water conservation and the effective utilization of water. Keywords: drip irrigation, earth resistance, NI-DAQ 6009, water conservation Cite this Article: R. Raj Kumar, K. Sriram. Design and Implementation of New Control System to Enhance the Usage of Drip Irrigation Water Using Lab View. International Journal of Algorithms Design and Analysis. 2019; 5(1): 19–25p. R. Raj Kumar, K. Sriram
Copyright (c)
2019-08-07 | pp. 19–25
Automatic Segregation of Wastes Collected from Beaches Using Programmable Logic Controller
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=507
The amount of waste being produced is increasing rapidly with the growing population. Utilizing waste material that is dumped anywhere and everywhere as landfill is also becoming a crucial issue at the municipal level, so it is important to have a structure for utilizing waste systematically, which is not in place at present. This paper proposes a scheme for segregating metallic and non-metallic waste materials on the beach using a movable vehicle: inductive proximity sensors are mounted on a conveyor, and a programmable logic controller automates the system.
Keywords: metallic and non-metallic wastes, movable vehicle, PLC, segregation
Cite this Article: S. Subhasini, C. Nandhini, N. Rabeeka Fathima, S.J. Suji Prasad. Automatic Segregation of Wastes Collected from Beaches Using Programmable Logic Controller. International Journal of Algorithms Design and Analysis. 2019; 5(1): 13–18p.
S. Subhasini, C. Nandhini, N. Rabeeka Fathima, S.J. Suji Prasad
Copyright (c)
2019-08-07 | pp. 13–18
Fuel Adulteration Gauge
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=506
This proposal describes a simple technique to check for adulteration of the petrol in a vehicle's fuel tank. The fuel tank is connected to a density module and a pH meter, and the measured parameters of the fuel are compared against reference values. If the pH value drops below or rises above the specific pH value of the fuel programmed into the controller, a red LED is switched on; the fuel must also satisfy the respective density of petrol or diesel. The density of petrol is 0.77 kg/L and that of diesel is 0.892 kg/L; the pH value of petrol is mostly neutral, while that of diesel varies from 5.5 to 8.0. If both checks pass, a green LED is switched on instead. The main advantage of this technique is that it restricts leaded petrol and other adulterants, keeping the petrol in its purest form.
Keywords: density, LED, petrol, ultrasonic sensor
Cite this Article: S. Vishanth, V. Shrinithi, S. Vijayadharan, V. Raahul, T. Kalavathi Devi. Fuel Adulteration Gauge. International Journal of Algorithms Design and Analysis. 2019; 5(1): 9–12p.
S. Vishanth, V. Shrinithi, S. Vijayadharan, V. Raahul, T. Kalavathi Devi
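The pass/fail rule described in this abstract can be sketched as a simple threshold check. This is an illustrative sketch, not the authors' controller code: the function and constant names and the assumed density tolerance are our own, while the density and pH figures are those quoted in the abstract.

```python
# Hypothetical sketch of the adulteration check: a sample passes only if
# both its pH and its density fall inside the band for the declared fuel.
FUEL_SPECS = {
    # fuel: (nominal density in kg/L, (pH low, pH high))
    "petrol": (0.770, (6.5, 7.5)),   # pH "mostly neutral" per the abstract
    "diesel": (0.892, (5.5, 8.0)),
}

DENSITY_TOLERANCE = 0.01  # kg/L; assumed sensor tolerance, not from the paper

def is_adulterated(fuel, measured_density, measured_ph):
    """Return True (red LED) if either parameter is out of spec."""
    nominal_density, (ph_lo, ph_hi) = FUEL_SPECS[fuel]
    density_ok = abs(measured_density - nominal_density) <= DENSITY_TOLERANCE
    ph_ok = ph_lo <= measured_ph <= ph_hi
    return not (density_ok and ph_ok)

print(is_adulterated("petrol", 0.771, 7.0))  # False: within spec, green LED
print(is_adulterated("diesel", 0.850, 6.0))  # True: density off, red LED
```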
Copyright (c)
2019-08-07 | pp. 9–12
Detection and Disposal of Medical Waste using IoT
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=483
Safe and sustainable management of medical waste is the legal and social responsibility of all the people involved in healthcare activities, either directly or indirectly. This paper discusses how Information and Communications Technology (ICT) and the Internet of Things (IoT) can help in the detection and tracking of medical waste and in its proper disposal and treatment. The paper highlights the use of ICT and IoT in medical waste management in various countries and aims at implementing the same in India. Wireless Radio-Frequency Identification (RFID) is used to track waste indoors, and the Global Positioning System (GPS) to track waste outside the premises.
Keywords: ICT, GPS, management, medical waste tracking, monitoring, RFID
Cite this Article: Aishwarya Shiva Hiremath. Detection and Disposal of Medical Waste using IoT. International Journal of Algorithms Design and Analysis. 2019; 5(1): 1–8p.
Aishwarya Shiva Hiremath
Copyright (c)
2019-08-07 | pp. 1–8
Load Balancing Algorithms in Cloud Computing: A Comparative Study
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=443
ABSTRACT: Cloud computing is an information technology paradigm that provides access to shared pools of configurable system resources and higher-level services. The cloud stores all data and distributed resources in an open environment, which in turn rapidly increases the amount of stored data. Load balancing is therefore a major problem in cloud storage, and the cost of maintaining load information on different nodes must be kept low. The workload should be scattered and balanced properly among all processing nodes. Many load balancing algorithms have now been proposed for efficient job and resource allotment: if resources are used efficiently, we obtain optimal profits and can balance time with optimized balancing algorithms. In this study we analyse load balancing techniques that aim to share data, computation and services transparently over a scalable network of nodes, and show which algorithm is best under various considerations, including cost.
Keywords: cloud computing, broker policy, performance evaluation, virtual machines, load balancing
Prasad G. Hiremath, S.S. Sannakki, Vijay S. Rajpurohit
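As a minimal illustration of the trade-off such comparative studies measure, the sketch below contrasts two classic dispatch policies, round-robin and least-loaded assignment, on a toy job stream. This is our own toy model for exposition, not one of the algorithms evaluated in the paper.

```python
import itertools

def round_robin(loads, jobs):
    """Assign each job cost to nodes cyclically, ignoring current load."""
    loads = list(loads)
    rr = itertools.cycle(range(len(loads)))
    for cost in jobs:
        loads[next(rr)] += cost
    return loads

def least_loaded(loads, jobs):
    """Assign each job to the currently least-loaded node."""
    loads = list(loads)
    for cost in jobs:
        i = loads.index(min(loads))
        loads[i] += cost
    return loads

jobs = [5, 1, 1, 1, 5, 1]               # uneven job costs
print(round_robin([0, 0], jobs))         # [11, 3]: badly skewed
print(least_loaded([0, 0], jobs))        # [6, 8]: lower makespan
```

With uneven job costs, the load-aware policy finishes sooner (makespan 8 vs 11) at the price of tracking per-node load, which is exactly the bookkeeping cost the abstract mentions.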
Copyright (c)
2019-03-07 | pp. 29–38
Implementation of Data Encryption Standard Algorithm Using Verilog
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=442
ABSTRACT: The Data Encryption Standard (DES) is a symmetric-key algorithm for the encryption of electronic data. It is called symmetric because the same algorithm and key are used for both encryption and decryption. DES is a block cipher: it encrypts data in 64-bit blocks, so a 64-bit block of plaintext goes in one end of the algorithm and a 64-bit block of ciphertext comes out the other end. The key length is 56 bits. To accomplish encryption, most secret-key algorithms use two main techniques known as substitution and permutation. Substitution is simply a mapping of one value to another, whereas permutation is a reordering of the bit positions of the input. These techniques are applied a number of times, in iterations called rounds. S-boxes are essentially non-linear substitution tables, where the output may be smaller than the input. The design is implemented using the Xilinx 13.1 tool with the ISE simulator, and the language used for the implementation is Verilog.
Keywords: Data Encryption Standard (DES), Advanced Encryption Standard (AES), International Business Machines (IBM), National Security Agency (NSA)
Ashwini R. Patil
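The substitution/permutation/round structure described in the abstract can be illustrated with a toy 8-bit substitution-permutation cipher. This is emphatically not DES (no Feistel structure, no 64-bit blocks) and it is in Python rather than the paper's Verilog; the S-box and bit permutation are arbitrary example tables chosen only to show how rounds of keyed substitution and permutation invert cleanly.

```python
# Toy 8-bit substitution-permutation sketch (NOT DES itself).
SBOX = [0x3, 0xF, 0xE, 0x0, 0x5, 0x4, 0xB, 0xC,
        0xD, 0xA, 0x9, 0x6, 0x7, 0x8, 0x2, 0x1]    # 4-bit substitution table
INV_SBOX = [SBOX.index(i) for i in range(16)]
PERM = [1, 5, 2, 0, 3, 7, 4, 6]   # output bit i comes from input bit PERM[i]
INV_PERM = [PERM.index(i) for i in range(8)]

def substitute(byte, box):
    """Substitution: map each 4-bit nibble through the table."""
    return (box[byte >> 4] << 4) | box[byte & 0xF]

def permute(byte, perm):
    """Permutation: reorder the 8 bit positions."""
    out = 0
    for i, src in enumerate(perm):
        out |= ((byte >> src) & 1) << i
    return out

def encrypt(byte, round_keys):
    for k in round_keys:                          # one round per key
        byte = permute(substitute(byte ^ k, SBOX), PERM)
    return byte

def decrypt(byte, round_keys):
    for k in reversed(round_keys):                # undo rounds in reverse
        byte = substitute(permute(byte, INV_PERM), INV_SBOX) ^ k
    return byte

keys = [0x1F, 0xA4, 0x3C]                         # example round keys
c = encrypt(0x5B, keys)
print(hex(c), hex(decrypt(c, keys)))              # decrypt recovers 0x5b
```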
Copyright (c)
2019-03-07 | pp. 5–12
An Approach on Image Processing of Two-Way Communication for Hearing Impaired and Dumb Person
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=435
ABSTRACT: It has been observed that, owing to birth defects, accidents and oral diseases, the number of deaf and dumb people is increasing, and for them communication is the major problem. They require smart systems that convert gestures to speech and vice versa. This paper presents the design and implementation of a system that uses image processing concepts to take hand gestures as input and generate recognizable output in the form of text and voice, and also to take speech as input and generate recognizable output in the form of gesture images.
Keywords: digital image processing, RGB, Java, JARS
Milind Rao Pawar
Copyright (c)
2019-03-07 | pp. 1–4
Uncontrolled Parameter Dependent Genetic Algorithm – Study and Implementation
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=434
Abstract: We present a concise survey of genetic algorithms (GAs) that do not require manual tuning of their parameters and are therefore called parameterless genetic algorithms (pGAs). There are three fundamental classes of parameterless GAs: deterministic, adaptive and self-adaptive pGAs. We additionally describe a new parameterless genetic algorithm (nGA), one that is simple to implement and performs exceptionally well on a set of five standard test functions.
Keywords: parameterless genetic algorithms (pGAs), new parameterless genetic algorithm (nGA), genetic algorithms (GAs)
E.N. Ganesh
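For readers unfamiliar with the baseline being made parameterless, here is a minimal generational GA on the OneMax test function, using the common parameter-free default mutation rate of 1/L (one expected bit flip per child). It is an illustrative sketch of a plain GA, not the nGA described in the paper; population size and generation count are arbitrary choices.

```python
import random

def one_max(bits):
    """Standard test function: count of 1-bits (maximum = length)."""
    return sum(bits)

def evolve(length=20, pop_size=16, generations=60, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            # binary tournament selection
            a, b = rng.sample(pop, 2)
            return a if one_max(a) >= one_max(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, length)        # one-point crossover
            child = p1[:cut] + p2[cut:]
            for i in range(length):               # mutation rate 1/L
                if rng.random() < 1.0 / length:
                    child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(one_max(ind) for ind in pop)

print(evolve())   # best fitness found; optimum is 20
```

The 1/L mutation rate and tournament size 2 are exactly the kind of defaults that deterministic pGAs fix in advance, while adaptive and self-adaptive pGAs adjust them during the run.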
Copyright (c)
2019-03-07 | pp. 19–28
Microsoft Azure Cloud: A Study of Support Services in Cloud Computing
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=410
Abstract: Microsoft Azure is a cloud computing platform created by Microsoft for building, testing, deploying and managing applications and services through a worldwide network of Microsoft-managed data centers. It offers software as a service (SaaS), platform as a service (PaaS) and infrastructure as a service (IaaS), and supports a wide variety of programming languages, tools and frameworks, including both Microsoft-specific and third-party software and systems. Azure is an extensive collection of cloud services that developers and IT professionals use to build, deploy and manage applications through this global network of data centers. Integrated tools, DevOps and a marketplace support building anything from simple mobile applications to web-scale solutions. This paper concentrates on the support given by Microsoft Azure to cloud services, and also discusses its performance aspects alongside the components of Microsoft Azure.
Keywords: Microsoft, software, Azure, support
Kamini, Tejinderpal Singh Brar
Copyright (c)
2019-01-16 | pp. 13–18
Inductive computations, anytime algorithms and emotions
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=373
In this paper, we show how the utilization of emotions enhances and improves two computing technologies, inductive and anytime, combining their advantages in a synthesized approach based on emotions. Inductive computations are used in many areas of computer and network technology, and inductive reasoning forms the base for scientific exploration. A mathematical model of inductive computations and reasoning is called an inductive Turing machine, a natural extension of the most popular model of computing devices and computations, the Turing machine. In comparison with Turing machines, inductive Turing machines represent the next step in the development of computer science, providing better models for contemporary computers and computer networks. Anytime computation enhances the traditional notion of a recursive computational procedure by allowing it to return many possible approximate answers to any given input and choosing which of them is taken as the result of the computation. Anytime algorithms use well-defined quality measures to monitor progress in problem solving and allocate computational resources effectively; the binary notion of correctness is replaced with a multivalued quality measure associated with each answer. Similar to anytime computation, inductive computation also produces a sequence of intermediate outputs, and it is necessary to determine the result of a computation. Sometimes the result is determined in the recursive mode, when the computing system itself announces that the result has been obtained; in other cases, the user has to decide that the result has been obtained, and here it is possible to apply the anytime approach based on quality measures. In some cases, these measures can be exact, allowing a unique decision; in other cases, only fuzzy measures can estimate intermediate outputs. That is why we suggest the utilization of emotions in result determination, modeling decision-making by people.
Keywords: artificial intelligence, emotion, recursive computation, inductive computation, measure, decision-making, algorithm
Mark Burgin
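The anytime notion of a sequence of intermediate outputs, each paired with a quality measure that decides when to accept a result, can be sketched with Newton's iteration for square roots. This is our own minimal illustration of the anytime idea, not a construction from the paper; the residual used as the quality measure is an arbitrary but natural choice.

```python
def anytime_sqrt(x, quality_target=1e-6, max_steps=50):
    """Newton iteration exposed as an anytime procedure: after every step
    an (answer, quality) pair is available, and the quality measure (the
    residual |guess^2 - x|) decides when the intermediate output is
    accepted as the final result."""
    guess = x if x >= 1 else 1.0
    history = []
    for _ in range(max_steps):
        guess = 0.5 * (guess + x / guess)
        quality = abs(guess * guess - x)   # smaller is better
        history.append((guess, quality))
        if quality <= quality_target:      # acceptance rule
            break
    return history

steps = anytime_sqrt(2.0)
print(steps[-1][0])   # close to sqrt(2) = 1.41421356...
```

Interrupting the loop at any step still yields a usable answer together with an explicit statement of its quality, which is precisely what distinguishes anytime computation from an all-or-nothing recursive procedure.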
Copyright (c)
2018-09-14 | pp. 47–57
An algorithm to Identify Batch Malware and the Performance of such Malware against Modern Antivirus Softwares
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=351
This paper studies computer viruses in batch file format. A batch file is a kind of script file in DOS, OS/2 and Microsoft Windows: it consists of a series of commands to be executed by the command-line interpreter, stored in a plain text file. A batch file may contain any command the interpreter accepts interactively, and may use constructs that enable conditional branching and looping within the file. Viruses written this way are not identified by modern antivirus software, including Norton Antivirus and Kaspersky. In this paper, we compare the effectiveness of batch viruses with their corresponding viruses in executable file format. A survey was also conducted to analyze public knowledge of batch viruses. The paper further explores an algorithm that antivirus software can use to identify and neutralize batch file viruses.
Adithya Vikram Sakthivel
Copyright (c)
2018-08-20 | pp. 40–46
Electric Power Generation Modelling By Using Decimal to Binary Conversion and Genetic Algorithm Technique
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=358
Generation system reliability assessment is a vital factor in power system planning and can be performed using probabilistic or deterministic methods. Probabilistic techniques have more advantages than deterministic methods, but they involve complicated modelling. A generation model is a basic requirement for assessing the reliability of the system, and one well-known form of the generation model is the capacity outage probability table (COPT). This paper analyses different techniques for developing the COPT, namely decimal-to-binary conversion (DBC) and the genetic algorithm (GA). The COPTs obtained from the two techniques are found to be comparable, and the techniques used are proven to be effective approaches to building the generation model.
Chanabasappa Bheraji
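The decimal-to-binary conversion idea can be sketched directly: the binary digits of each integer from 0 to 2^n - 1 encode one up/down state of the n generating units, and summing the state probabilities by total outage capacity yields the COPT. The sketch below is our reconstruction of that enumeration under stated assumptions; the three-unit system and its forced outage rates are made-up illustrative data, not from the paper.

```python
from collections import defaultdict

def copt(units):
    """Build a capacity outage probability table.
    units: list of (capacity_MW, forced_outage_rate).
    Each state number's binary digits pick which units are on outage."""
    n = len(units)
    table = defaultdict(float)
    for state in range(2 ** n):
        outage, prob = 0, 1.0
        for i, (cap, q) in enumerate(units):
            if (state >> i) & 1:     # unit i forced out
                outage += cap
                prob *= q
            else:                    # unit i available
                prob *= 1 - q
        table[outage] += prob        # merge states with equal outage
    return dict(sorted(table.items()))

# Illustrative system: two 3 MW units and one 5 MW unit, each with FOR = 0.02.
table = copt([(3, 0.02), (3, 0.02), (5, 0.02)])
for mw, p in table.items():
    print(f"{mw:2d} MW out  p = {p:.6f}")
```

The probabilities across all outage levels sum to 1, which is a convenient sanity check on any COPT construction.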
Copyright (c)
2018-08-17 | pp. 33–39
Development of an Efficient String Matching Algorithm for Large Documents Sorting
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=360
This research presents a comparative analysis of string matching algorithms, focusing on the development of an efficient algorithm for sorting large documents. Compared with the existing system, the proposed algorithm searches backward, from the last character to the first, whereas the existing system requires more shifts, is slow, and relies on a bad-character shift. The research methodology adopted is verification and validation, in which testing proceeds in reverse so that the software developer can review each stage step by step. The efficient string matching algorithm is implemented in the Java programming language. When executed and compared with the existing system, the proposed system proved more efficient: it searches documents faster, with a running time of 2 milliseconds against 154 milliseconds for the naïve algorithm on a total of 5000 characters.
Amannah Constance Izuchukwu
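The abstract describes backward, last-character-first comparison with character-based shifts but gives no code. The sketch below shows the standard Boyer-Moore-Horspool scheme, which matches that description; it is our reconstruction in Python rather than the paper's Java, and is not the authors' implementation.

```python
def horspool_search(text, pattern):
    """Return the index of the first occurrence of pattern in text, or -1.
    Characters are compared backward (last pattern character first), and
    the bad-character table lets whole alignments be skipped at once."""
    m, n = len(pattern), len(text)
    if m == 0:
        return 0
    if m > n:
        return -1
    shift = {}
    for i in range(m - 1):             # last character deliberately excluded
        shift[pattern[i]] = m - 1 - i
    pos = 0
    while pos <= n - m:
        j = m - 1
        while j >= 0 and text[pos + j] == pattern[j]:
            j -= 1                     # backward comparison
        if j < 0:
            return pos                 # full match
        pos += shift.get(text[pos + m - 1], m)
    return -1

print(horspool_search("here is a simple example", "example"))  # 17
```

On large texts the shift table skips up to m characters per mismatch, which is where the speedup over the naïve left-to-right scan comes from.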
Copyright (c)
2018-08-06 | pp. 10–32
Weather Prevision Using Wilder Moving Average Technique
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=359
Temperature prediction is a transient, time-series-based process. Accurate forecasting is vital today, as the agricultural and industrial sectors depend heavily on temperature. Owing to the non-linearity of atmospheric physics, neural networks are well suited to predicting these meteorological processes, and backpropagation combined with a genetic algorithm is the most important algorithm for training neural networks. In this paper, in order to model the dependence of temperature on a particular input sequence, a time-series-based temperature prediction model using backpropagation integrated with the Wilder moving average technique is proposed. The proposed method also shows the effect of under-training and over-training the system, and the experimental results of the network are reported alongside.
Aatif Jamshed, Dr. Pramod Kumar, Dr. Bhawna Mallick
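Wilder's moving average (the smoothing Welles Wilder introduced for technical indicators) is conventionally seeded with a simple average and then updated as wma = (previous_wma * (n - 1) + value) / n. Assuming this is the variant the paper means, a minimal sketch on made-up temperature readings:

```python
def wilder_moving_average(values, period):
    """Seed with the simple average of the first `period` values, then
    smooth each new value: wma = (wma * (period - 1) + value) / period."""
    if len(values) < period:
        raise ValueError("need at least `period` values")
    wma = sum(values[:period]) / period
    out = [wma]
    for v in values[period:]:
        wma = (wma * (period - 1) + v) / period
        out.append(wma)
    return out

temps = [30, 32, 31, 33, 35, 34, 36]    # hypothetical daily temperatures
print(wilder_moving_average(temps, 3))   # smoothed series, starting at 31.0
```

Because each update keeps (n-1)/n of the previous value, the series reacts more slowly than a simple moving average of the same period, which is useful for damping noisy meteorological inputs before they reach the network.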
Copyright (c)
2018-08-06 | pp. 1–9
Virtual Private Database
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=306
A virtual private database (VPD) masks data in a larger database so that only a subset of the data appears to exist, without truly segregating the data into different schemas, tables or databases. A characteristic application is compelling sites, departments, individuals, etc. to operate only on their own records, while at the same time permitting more privileged users and operations (e.g. reports, data warehousing) to access the entire table.
Neelima Goyal
Copyright (c)
2018-02-16 | pp. 31–33
Secure Cloud Storage
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=304
As an emerging technology and business paradigm, cloud computing is taking commercial computing by storm. It provides easy access to a company's high-performance storage and computing infrastructure through web services. We consider the problem of building a secure cloud storage service on top of a public cloud infrastructure where the service provider is not fully trusted by the customer. We describe, at a high level, several architectures that combine new and non-traditional cryptographic primitives in order to achieve our goals. We examine the benefits such a design would provide to both customers and service providers, and give a summary of recent advances in cryptography motivated specifically by cloud storage.
Sakeena Gul
Copyright (c)
2018-02-16 | pp. 23–30
Skinput
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=305
Skinput is an input technology that uses bio-acoustic sensing to localize finger taps on the skin. When augmented with a pico-projector, the device can provide a direct-manipulation graphical user interface on the body. While other systems, such as SixthSense, have attempted this with computer vision, Skinput uses acoustics, which takes advantage of the human body's natural sound-conductive properties (e.g., bone conduction). This allows the body to be annexed as an input surface without the need for the skin to be invasively instrumented with sensors, tracking markers or other items. I present an approach to appropriating the human body as an input surface.
Aastha Alica Das
Copyright (c)
2018-02-16 | pp. 18–22
Survey on Error Handling using Service Oriented Architecture
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=310
Service-oriented architecture (SOA) is a popular design paradigm for distributed systems, and services are playing an increasingly important role in modern application development and composite applications. One may ask how to implement SOA successfully. The objective of this study is to examine the key issue behind users' negative attitude towards the introduction of SOA design: the fear of the complexity that SOA brings with its layers. Most composite applications need to be reliable and available, but this may appear more difficult to achieve because of SOA's multi-layered architecture. To reduce the fear of complexity and the associated risk, and to generate lightweight messages usable by all types of clients (users) when introducing an SOA architecture, it is necessary to use error handling and recovery methods in order to increase system fault tolerance. This paper looks at various error-handling considerations associated with the design of reusable services.
Pratima Kedarnath Yadav
Copyright (c)
2018-02-16 | pp. 11–17
A Multi-Query Optimization Algorithm Using Map Reduce
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=288
The need to store statements about web resources led to the emergence of semantic web technology. The World Wide Web Consortium (W3C) standard for storing semantic web data is the Resource Description Framework (RDF), but existing frameworks do not provide scalability for large RDF graphs. This paper focuses on the problem of multi-query optimization for semantic web data. A scalable framework for storing RDF graphs is designed using the Hadoop Distributed File System, and the problem of multi-query optimization from the perspective of SPARQL is revisited in this research. Algorithms for multi-query optimization are proposed, and query execution is carried out through MapReduce programming to obtain the final result of the optimized query. Experiments were conducted on the LUBM benchmark dataset, with the algorithm executed on the Jena data store and the Hadoop framework; the efficiency and scalability of the algorithm were tested and the results are documented.
Keywords: Hadoop, MapReduce, query optimization, Resource Description Framework, semantic web
R. Gomathi, S. Logeswari, B. Gomathy
Copyright (c)
2017-12-15 | pp. 1–10
Algorithmseer: A System for Extracting and Searching for Algorithms in Scholarly Big Data
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=217
Identification and extraction of various informative entities from scholarly digital documents is an active area of research. For algorithm discovery in digital documents, earlier work described a method for automatic detection of pseudo-codes (PCs) in computer science publications. That method assumes that each PC is accompanied by a caption, so a PC can be identified using a set of regular expressions that capture the presence of the caption. However, such an approach is limited in coverage, because it relies on the presence of PC captions despite wide variations in the writing styles of different journals and authors. While PCs are commonly used in scientific documents to represent algorithms, many algorithms are instead represented as algorithmic procedures (APs). An algorithmic procedure is a set of descriptive algorithmic instructions and differs from a PC in writing style and in its location within documents. Algorithms represented in documents do not conform to specific styles and are written in arbitrary formats, which makes effective identification and extraction a challenge. We propose a novel methodology based on ensemble machine learning to discover algorithm representations such as PCs and APs automatically. Moreover, we observe that two or more representations may describe the same algorithm; hence we also propose a simple heuristic that links the different algorithm representations that together constitute an algorithm. Automatic discovery and extraction of these algorithm representations will be useful for applications in digital libraries and document engineering. The project also presents an automatic annotation method that first groups the data units on a result page so that the data in the same group have the same semantics.
Then, for each group, it annotates the group from different aspects and aggregates the annotations to predict a final annotation label for it. An annotation wrapper for the search site is thereby created and can be used to interpret new result pages from the same web database. The project proposes a clustering-based shifting technique to align data units into groups so that the data units inside the same group share the same semantics. Instead of using only the DOM tree or other HTML tag-tree structures of the SRRs to group the data units (as most current methods do), the approach also considers other important features shared among data units, such as their data types (DT), data contents (DC), presentation styles (PS) and adjacency (AD) information. The experiments indicate that the proposed approach is highly effective.
A. Kavitha, R. Ranjana, M. Nandhini, D. Kiruthika
Copyright (c)
2017-09-27 | pp. 1–6
Replicants the Tone for Real Images
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=99
High dynamic range (HDR) imaging is a method that allows a great dynamic range of luminance between the lightest and darkest regions of an image. For video compression, the HDR sequence is reconstructed by inverse tone-mapping a compressed low dynamic range (LDR) version of the original HDR content. We show that the right choice of tone-mapping operator (TMO) can significantly improve the reconstructed HDR quality. Tone mapping compresses a large range of pixel luminances into a smaller range that is suitable for display on devices with limited dynamic range. We formulate a numerical optimization problem to find the tone curve that minimizes the expected mean square error (MSE) in the reconstructed HDR sequence, and we develop a simplified model that reduces the computational complexity of the optimization problem to a closed-form solution. It is also shown that the LDR picture quality produced by the proposed techniques matches that produced by perceptually based TMOs.
B.M. Alaudeen
Copyright (c)
2017-09-27 | pp. 25–34
Unbiased-Weighted-Crawl Algorithm Based Aggregate Estimation in Hidden Databases With Checkbox Interfaces
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=222
A large number of web data repositories are hidden behind restrictive web interfaces, making it an important challenge to enable data analytics over these hidden web databases. This work enables aggregate queries over a hidden database with a checkbox interface by issuing a small number of queries (sampling) through its web interface. Experiments on both synthetic and real datasets demonstrate the accuracy and efficiency of the algorithms. To enable approximate processing of aggregate queries, we develop the UNBIASED-WEIGHTED-CRAWL algorithm, which performs random drill-downs on a novel structure of queries referred to as a left-deep tree; we also propose weight adjustment and low-probability crawl to improve estimation accuracy.
P. Mallika, M. Ashwin, A.M. Ravishankkar
Copyright (c)
2017-05-18 | pp. 28–31
Secure Image Transmission Framework for Health Care Services
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=221
Medical image transmission is a very important task in telemedicine and remote health services. Authentication, confidentiality and integrity are the key issues in the secure data transmission process: data-source verification is carried out by the authentication process, data encryption and decryption operations provide confidentiality, and data integrity verification checks the received data against the original values. The watermarking technique integrates all the data security operations: watermark-based authentication and integrity verification secure the medical image transmission process, while confidentiality is ensured with cryptographic techniques. The hybrid algorithm (HA) integrates watermarking and cryptography to protect the medical image transmission. Patient health records (PHR) are maintained in secret text files, which provide the details about the patient and the medical image. Data hiding methods are applied to hide the secret text in the cover image values, and the secure medical image transmission system is constructed with data hiding and cryptography techniques. The enhanced hybrid algorithm (EHA) combines data hiding, watermarking and cryptography: the secret text is encrypted with the RSA algorithm, the encrypted text is hidden in the medical image using the least-significant-bit (LSB) encoding scheme, and the watermark carries the authentication and data integrity verification operations. The system is developed using a Java front end and an Oracle relational database environment.
K. Suganya, G. Malathi
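The LSB hide/unhide step named in the abstract can be sketched in a few lines: each bit of the secret text is written into the least significant bit of one cover byte, changing each pixel value by at most 1. This is a generic LSB sketch in Python (the paper's system is in Java), with made-up cover data standing in for image pixels; it omits the RSA and watermark layers.

```python
def hide(cover, message):
    """Write the bits of `message` (bytes) into the least significant
    bit of successive cover bytes; cover must be long enough."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for message")
    stego = list(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit     # clear LSB, set to message bit
    return stego

def reveal(stego, length):
    """Read `length` bytes back out of the LSBs, in the same bit order."""
    out = []
    for b in range(length):
        byte = 0
        for i in range(8):
            byte |= (stego[b * 8 + i] & 1) << i
        out.append(byte)
    return bytes(out)

cover = bytes(range(200, 256)) * 3     # stand-in for grayscale pixel bytes
stego = hide(cover, b"ok")
print(reveal(stego, 2))                # b'ok'
```

Because only the lowest bit of each byte changes, the stego image is visually indistinguishable from the cover, which is why LSB encoding is the usual carrier for the encrypted patient record.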
Copyright (c)
2017-05-18 | pp. 19–27
Multikeyword Ranked Search Based on Hierarchical Clustering Index
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=220
Outsourcing data to the cloud has become an effective trend in modern computing due to its ability to provide low-cost, pay-as-you-go IT services. Cloud data owners prefer to outsource documents in encrypted form for the purpose of privacy preservation, so it is necessary to develop effective and reliable ciphertext search techniques. One challenge is that the relationships between documents are normally concealed in the process of encryption, which leads to significant degradation of search accuracy. The volume of data in data centers has also experienced dramatic growth, making it even more challenging to design ciphertext search schemes that provide efficient and reliable online information retrieval over large volumes of encrypted data. In this paper, a hierarchical clustering method is proposed to support richer search semantics and to meet the demand for fast ciphertext search in a big-data environment. The proposed hierarchical approach clusters the documents based on a minimum relevance threshold and then partitions the resulting clusters into sub-clusters until the constraint on the maximum cluster size is reached. In order to confirm the authenticity of search results, a structure called the minimum hash sub-tree is designed in this paper. The results show that with a sharp increase in the number of documents in the dataset, the search time of the proposed method increases linearly whereas that of the traditional method increases exponentially; furthermore, the implemented method has an advantage over the traditional method in rank privacy and in the relevance of retrieved documents.
V. Lalitha, K. Gandhimathi
Copyright (c)
2017-05-18 | pp. 14–18
Crop Yield Forecasting for Certain Agriculture Products and Marketing
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=219
Agriculture is one of the major revenue-producing sectors of India and a source of livelihood. Various seasonal, economic and biological factors influence crop production, and unpredictable changes in these factors lead to great losses for farmers. These risks can be quantified when appropriate mathematical or statistical methodologies are applied to data on soil, weather and past yields. With the advent of data mining, crop yield can be predicted by deriving useful insights from these agricultural data, helping farmers decide which crop they would like to plant in the forthcoming year for maximum profit. In this paper, the ARIMA algorithm is used for price forecasting and the multiple linear regression algorithm for yield prediction; from the experimental results, we forecast the yield and price of certain crops.
R. Madhumathi, B.H. Dhivya, R. Manjula, S. Siva Bharathi
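The multiple linear regression step can be sketched via the normal equations (X'X)b = X'y. This is a generic ordinary-least-squares sketch on synthetic, noise-free data; the predictor names (rainfall, fertilizer) and coefficients are invented for illustration and are not the paper's dataset or results.

```python
def fit_mlr(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved with Gaussian elimination. A column of ones is prepended
    for the intercept."""
    rows = [[1.0] + list(r) for r in X]
    k = len(rows[0])
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    for col in range(k):                       # forward elimination
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, k):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * k                           # back substitution
    for r in range(k - 1, -1, -1):
        s = sum(xtx[r][c] * beta[c] for c in range(r + 1, k))
        beta[r] = (xty[r] - s) / xtx[r][r]
    return beta

# Synthetic: yield = 2 + 3*rainfall + 0.5*fertilizer (no noise)
X = [(1, 2), (2, 1), (3, 4), (4, 3), (5, 5)]
y = [2 + 3 * a + 0.5 * b for a, b in X]
print(fit_mlr(X, y))   # recovers [intercept, rainfall, fertilizer] coefficients
```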
Copyright (c)
2017-05-18 | pp. 7–13
A Quantitative Combination of C-S Edge Detectors Specifically for Edge Detection of Gray-Scale Images
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=43
Abstract: An edge in an image can be defined as a set of connected pixels that forms the boundaries of objects in the image. Generally, edge detection is the process of segmenting an image into regions of discontinuity, and it plays a major role in digital image processing as well as in practical aspects of our lives. The edges of an image can be generated by implementing several edge-detector mechanisms. In this paper, an edge detection mechanism that computes the edges of different images using a quantitative combination of the Canny and Sobel (C-S) edge detectors is presented, and the proposed methodology is compared with the Canny and Sobel edge detectors separately.
Keywords: edge detection, digital image processing, Canny-Sobel (C-S) edge detectors
Kalyan Kumar Jena, Sasmita Mishra
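The Sobel half of the C-S combination is simple enough to sketch in full: two 3x3 kernels estimate the horizontal and vertical gradients, and their magnitude marks edge strength. This shows only the standard Sobel operator on a tiny synthetic image, not the authors' quantitative combination with Canny.

```python
def sobel_magnitude(img):
    """img: 2-D list of gray levels. Returns gradient magnitude at
    interior pixels (borders left at 0) using the 3x3 Sobel kernels."""
    gx_k = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient
    gy_k = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(gx_k[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(gy_k[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# Vertical step edge: left half dark, right half bright.
img = [[0, 0, 10, 10]] * 4
edges = sobel_magnitude(img)
print(edges[1])   # strong response at the step, zero at flat borders
```

A combined C-S detector would then fuse such a Sobel magnitude map with Canny's thresholded, thinned output; the fusion rule itself is the paper's contribution and is not reproduced here.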
2016-12-29 · Vol. 8, Issue 2, pp. 39–44
Designing for Fingerprint Image Enhancement
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=115
In this paper, UML diagrams for fingerprint image enhancement are shown. The Unified Modeling Language (UML) is a standard visual modeling language. There are various types of UML diagrams, such as the class diagram, use case diagram, sequence diagram and collaboration diagram. Data flow diagrams are also shown, and the waterfall software engineering process model is also described. Keywords: data flow diagrams (DFDs), waterfall model, unified modeling language (UML)
Mayur Patil
2016-12-29 · Vol. 8, Issue 2, pp. 33–38
A Novel Noise Removal Method for Lung CT SCAN Images Using Statistical Filtering Techniques
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=49
Image denoising is a procedure in digital image processing aimed at removing noise, which can corrupt an image during its acquisition or transmission, while retaining its quality. Medical image enhancement technologies have attracted much attention since advanced medical instruments were put into use in the medical field. Enhanced medical images are desired by surgeons to aid diagnosis and interpretation, because medical image quality is often degraded by noise, data acquisition devices, illumination conditions, etc. Our target in medical image enhancement is mainly to resolve the problem of high-level noise in a medical image. The noise present in the images may appear as additive or multiplicative components, and the main purpose of denoising is to remove these noisy components while preserving the important signal as much as possible. In this paper, we analyze denoising filters such as the mean, median, midpoint and Wiener filters, along with three further modified filter approaches, for lung CT scan images, to remove the noise present in the images; the filters are compared using quality parameters. Keywords: medical images, noise removal, filtering approach, statistical filters, quality measures
S. Sivakumar, C. Chandrasekar
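Two of the standard filters named in the abstract can be sketched with a 3x3 sliding window; the paper's three modified filters are not specified in the abstract, so only the textbook median and midpoint filters are shown, on a toy image with a single impulse-noise pixel.

```python
import numpy as np

def filter3x3(img, reducer):
    # Apply `reducer` over every interior 3x3 window; border pixels are copied.
    h, w = img.shape
    out = img.astype(float).copy()
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            out[i, j] = reducer(img[i - 1:i + 2, j - 1:j + 2])
    return out

def median_filter(img):
    # Median filter: robust against impulse ("salt and pepper") noise.
    return filter3x3(img, np.median)

def midpoint_filter(img):
    # Midpoint filter: average of the window's minimum and maximum values.
    return filter3x3(img, lambda win: (win.min() + win.max()) / 2.0)

# A flat region with one bright impulse pixel.
img = np.full((5, 5), 10.0)
img[2, 2] = 255.0
print(median_filter(img)[2, 2])  # → 10.0: the impulse is removed
```

Note the contrast in behaviour: on the same window the midpoint filter returns (10 + 255) / 2 = 132.5, which is why it suits uniform or Gaussian noise rather than impulses.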
2016-12-29 · Vol. 8, Issue 2, pp. 25–32
Designing of Vehicle Insurance Platform by using Java as Script—Part I
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=199
Recently, states have started passing laws stating that electronic versions of proof of insurance will now be accepted by the authorities. Customers may be protected by different levels of coverage depending on which insurance they purchase. Some states require drivers to carry at least minimum insurance coverage to ensure that their drivers can cover the cost of injury to people or property in the event of an accident.
Shefali Handa
2016-12-29 · Vol. 8, Issue 2, pp. 8–24
Algorithms, Design and Analysis of Brain Fingerprinting
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=196
Brain fingerprinting depends on the finding that the brain produces a unique brain-wave pattern when a person encounters a familiar stimulus. The use of functional magnetic resonance imaging in lie detection derives from studies suggesting that people asked to lie show different patterns of brain activity than they do when being truthful. Issues related to the use of such evidence in courts are discussed. The author concludes that neither approach is currently supported by enough data regarding its accuracy in detecting deception to warrant use in court. This test uses what Farwell calls the MERMER response to detect a familiarity reaction. One of its applications is lie detection. Dr. Lawrence A. Farwell has designed, developed, demonstrated, and patented the technique of Farwell Brain Fingerprinting, a new computer-based technology to identify the perpetrator of a crime accurately and scientifically by measuring brain-wave responses to crime-relevant words or pictures presented on a computer screen. Farwell Brain Fingerprinting has proved 100% accurate in more than 120 tests, including tests on FBI agents, tests for a US intelligence agency and for the US Navy, and tests on real-life situations involving actual crimes.
Avinash Chauhan
2016-12-29 · Vol. 8, Issue 2, pp. 1–7
Secured Collaborative Environment for Accessing Big Data
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=47
In this paper, we discuss what big data is and what its main features are; the main issues faced in accessing big data; the main features of a collaborative environment for accessing big data; and how to provide security for big data in a collaborative environment. Keywords: big data, features, collaborative environment
G. Dileep Kumar, R. Praveen Sam
2016-12-28 · Vol. 8, Issue 2, pp. 19–24
Some Results on APS-Injective Rings and Modules
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=57
We show that if R is a right APS-injective ring and aR is a principal right ideal contained in J(R) such that aR is isomorphic to a direct summand of R, then aR = (0). We also show that if R is a right APS-injective ring, then any principal right ideal contained in the Jacobson radical is projective if and only if it is a direct summand of the ring. Finally, we study an example of a left APS-injective ring which is not right APS-injective. Keywords: APS-injective rings and modules, semiprimitive rings, Baer rings
Soumitra Das
2016-12-28 · Vol. 8, Issue 2, pp. 14–18
Testing of Vehicle Insurance Platform by Using Java as Script: Part II
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=194
Testing is a very important aspect of completing any process in any industry or company.
Shefali Handa
2016-12-28 · Vol. 8, Issue 2, pp. 10–13
System Analysis and Design for the Automation of Centralized Library Activities
https://computers.journalspub.info/index.php?journal=JADA&page=article&op=view&path%5B%5D=193
The current working system of the State Central Library operates manually. The system cannot be enhanced to serve the ever-increasing numbers of members, books, records, etc. For all these years, the work of the library has been done by hand: issuing of books, renewal of books, book returns, and the purchase of new books, furniture, etc. are recorded daily on paper. This article describes how a system is designed to centralize library activities.
Roger Ahongshangbam
2016-12-28 · Vol. 8, Issue 2, pp. 1–9