Saturday 20 April 2013

Slicing of UML Architectural Models

www.imanagerpublications.com
Vol.7 No.2
Year: 2012
Issue: October-December
Title: Slicing of UML Architectural Models   
Author Name: Sasikala Jayaprakash, S. Narayanan   
Synopsis:   
Dynamic slicing is a proposed technique for slicing architectural models. The presence of related information scattered across diverse model parts makes dynamic slicing of Unified Modeling Language (UML) models difficult. In most cases the UML model needs to be converted into an intermediate representation, which forms the data structure manipulated by an algorithm designed for a specific goal. Different intermediate representations and their associated algorithms produce slicing results of differing effectiveness. Slicing techniques are also used for impact analysis among the model elements of different architectural diagrams. Specifying the slicing criterion is another aspect that can be observed in previous work. This paper summarizes the previous work along with its methodologies and results.

A Novel Algorithm for Generalized Image Denoising Using Dual Tree Complex Wavelet Transform

Vol.7 No.2
Year: 2012
Issue: October-December
Title: A Novel Algorithm for Generalized Image Denoising Using Dual Tree Complex Wavelet Transform 
Author Name: Sk. Umar Faruq, K.V. Ramanaiah, K. Soundara Rajan
Synopsis:   
Preserving the significant image content of interest from contamination by noise is an unfortunate hardship: noise arises from intrinsic factors such as sensor (CCD camera) element imperfections, scarce illumination, and digitization, as well as from extrinsic factors such as environmental conditions and alignment, across a wide variety of applications including satellite television, magnetic resonance imaging, and computed tomography, and in areas of research and technology such as geographical information systems and astronomy. This has kept image denoising an evergreen problem in image processing. Image denoising is the primary task prior to any high-level image processing operation; its underlying goal is to remove noise while preserving edges. It remains a hard problem, for which several algorithms, each with its own assumptions, advantages, and limitations, have been published, and whose inherent averaging leads to the loss of significant high-frequency image features of interest. In this paper we discuss the importance of the nearly shift-invariant, directionally selective, dyadic-decomposition-tree-based dual tree complex wavelet transform (DT-CWT) together with an intelligent filter module (IFM) that decides which filter type to use to denoise the image and remove the resulting blur based on the noise type, and which produces encouraging results in terms of psycho-visual quality and performance metrics compared with previous tools and techniques.
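
The paper's DT-CWT and intelligent filter module are not specified in detail here; as a rough illustration of transform-domain denoising, the sketch below applies soft thresholding to ordinary discrete wavelet coefficients using PyWavelets. The plain DWT, the noise estimate, and the universal-threshold rule are assumptions standing in for the authors' method.

# Hypothetical wavelet-thresholding denoiser; a plain DWT stands in for the
# paper's DT-CWT, and the universal threshold is only an assumed filter rule.
import numpy as np
import pywt

def wavelet_denoise(image, wavelet="db4", levels=3):
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    # Estimate the noise standard deviation from the finest diagonal band.
    detail = coeffs[-1][-1]
    sigma = np.median(np.abs(detail)) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(image.size))  # universal threshold
    denoised = [coeffs[0]]  # keep the approximation band untouched
    for level_bands in coeffs[1:]:
        denoised.append(tuple(pywt.threshold(band, thresh, mode="soft")
                              for band in level_bands))
    return pywt.waverec2(denoised, wavelet)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0
    noisy = clean + 0.2 * rng.standard_normal(clean.shape)
    print("residual noise std:", np.std(wavelet_denoise(noisy)[:64, :64] - clean))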

Automatic Region Detection of Facial Feature using Haar Transform

Vol.7 No.2
Year: 2012
Issue: October-December
Title: Automatic Region Detection of Facial Feature using Haar Transform   
Author Name: N.S Priya   
Synopsis:   
This paper proposes automatic region detection of facial features in an image, an important stage for various facial image manipulation tasks such as face recognition, facial expression recognition, 3D face modeling, and facial feature tracking. Detecting the regions of facial features such as the eyes, pupils, mouth, nose, nostrils, lip corners, and eye corners across different facial images, under varying region selection and illumination, is a challenging task. In this paper we present different methods for fully automatic region detection of facial features. An object detector with cascaded Haar-like features is used to detect the face, eyes, and nose. Novel techniques based on basic facial geometry are proposed to locate the mouth, nose, and eye positions. Estimating the detection region for each feature (eye, nose, and mouth) effectively improves detection accuracy. An algorithm using the H-plane of the HSV color space is proposed to detect the eye pupil within the detected eye region. The proposed algorithm is tested on 100 frontal face images with two different facial expressions (neutral and smiling).
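
A minimal sketch of the detection pipeline the abstract describes, using OpenCV's bundled Haar cascades for the face and eyes and a simple threshold on the HSV hue plane inside each eye region as a stand-in for the paper's pupil-detection algorithm. The hue threshold value and the test image path are assumptions.

# Sketch: Haar-cascade face/eye detection plus an assumed H-plane rule for the pupil.
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_facial_regions(bgr_image):
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        face_roi = bgr_image[y:y + h, x:x + w]
        eyes = eye_cascade.detectMultiScale(gray[y:y + h, x:x + w], 1.1, 5)
        pupil_masks = []
        for (ex, ey, ew, eh) in eyes:
            hsv = cv2.cvtColor(face_roi[ey:ey + eh, ex:ex + ew], cv2.COLOR_BGR2HSV)
            hue = hsv[:, :, 0]
            # Assumed rule: low-hue pixels inside the eye box mark the pupil region.
            _, pupil_mask = cv2.threshold(hue, 20, 255, cv2.THRESH_BINARY_INV)
            pupil_masks.append(pupil_mask)
        results.append({"face": (x, y, w, h), "eyes": list(eyes), "pupil_masks": pupil_masks})
    return results

if __name__ == "__main__":
    img = cv2.imread("frontal_face.jpg")  # hypothetical test image path
    if img is not None:
        print(detect_facial_regions(img))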

Revisiting the experiment on Content Based Image Retrieval System Using Aliveness Detection

Vol.7 No.2
Year: 2012
Issue: October-December
Title: Revisiting the experiment on Content Based Image Retrieval System Using Aliveness Detection   
Author Name: Dwarkoba Gaikwad   
Synopsis: 
Internet use is growing rapidly in society and industry. In information technology, people search for information in the form of text and images, and many techniques exist to extract the required information from raw text and image data. Most search engines, such as Google, rely primarily on text-based retrieval, which is suitable for retrieving textual information only; when information is needed in both text and image form, text-based retrieval alone is not sufficient. In recent years, content based image retrieval (CBIR) techniques have been proposed to search text and images together. Given its importance in information technology, we discuss all aspects of content based image retrieval in detail. The objective of this paper is to study the different existing CBIR techniques and their applications. Our findings are based on a review of the relevant literature and should be very useful for researchers who are new to CBIR. We also present a content based image retrieval system developed at our site, based on similarity measurement between color histograms of images. We used our own image database, built from the college annual gathering and the Engineering Today 2011 technical event, to test the system, and found that it works properly and gives excellent results. An extension of the same experiment is suggested as an algorithm that can help with album making.
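
A minimal sketch of color-histogram retrieval in the spirit of the system described: images are indexed by normalized per-channel RGB histograms and ranked by histogram intersection against the query. The bin count, the intersection measure, and the synthetic stand-in images are assumptions; the authors' exact similarity measure is not stated in the abstract.

# Sketch of histogram-based CBIR: index images by normalized RGB histograms
# and rank them by histogram intersection with the query image.
import numpy as np

def rgb_histogram(image, bins=8):
    """Concatenated, L1-normalized per-channel histogram of an HxWx3 uint8 image."""
    feats = [np.histogram(image[:, :, c], bins=bins, range=(0, 256))[0] for c in range(3)]
    hist = np.concatenate(feats).astype(float)
    return hist / hist.sum()

def histogram_intersection(h1, h2):
    return np.minimum(h1, h2).sum()  # 1.0 means identical normalized histograms

def rank_database(query, database):
    q = rgb_histogram(query)
    scores = [(name, histogram_intersection(q, rgb_histogram(img))) for name, img in database]
    return sorted(scores, key=lambda s: s[1], reverse=True)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    db = [(f"img_{i}", rng.integers(0, 256, (32, 32, 3), dtype=np.uint8)) for i in range(5)]
    print(rank_database(db[2][1], db))  # the query's own entry should rank first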

Tuesday 16 April 2013

Analysis on fuzzy membership functions for image segmentation using ultrafuzziness

Vol.7 No.1
Year: 2012
Issue: July-September
Title: Analysis on fuzzy membership functions for image segmentation using ultrafuzziness   
Author Name: Seetharama Prasad, Kolluri Raju, Ch Venkata Narayana
Synopsis:   
In this paper, a study of fuzzy membership functions for image segmentation using ultrafuzziness is conducted. The Tizhoosh membership function, which is fully supervised, the Huang and Wang membership function, and the S-function are considered. This work is an improvement of an existing work of Tizhoosh. Each membership function has its own merits and demerits in the computation process. Using fuzzy logic concepts, the problems involved in finding the minimum or maximum of an entropy criterion function are avoided. We attempt to identify the membership function best suited to assigning a fuzzy membership grade to every pixel in the image for optimum segmentation using ultrafuzziness. For low-contrast images, contrast enhancement is assumed. Experimental results demonstrate a quantitative improvement with the S-function over the other two functions.
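
The abstract compares membership functions but does not reproduce them; as a reference point, the sketch below implements the standard Zadeh S-function often used to assign fuzzy membership grades to gray levels, together with the linear index of fuzziness. The parameter choices in the example are assumptions, not the authors' settings.

# Standard S-function membership over gray levels; crossover point b = (a + c) / 2.
import numpy as np

def s_function(x, a, c):
    x = np.asarray(x, dtype=float)
    b = (a + c) / 2.0
    mu = np.zeros_like(x)
    left = (x > a) & (x <= b)
    right = (x > b) & (x <= c)
    mu[left] = 2.0 * ((x[left] - a) / (c - a)) ** 2
    mu[right] = 1.0 - 2.0 * ((x[right] - c) / (c - a)) ** 2
    mu[x > c] = 1.0
    return mu

def linear_index_of_fuzziness(mu):
    """Distance of the membership values from crisp 0/1 decisions."""
    return 2.0 * np.mean(np.minimum(mu, 1.0 - mu))

if __name__ == "__main__":
    gray = np.arange(256)
    mu = s_function(gray, a=60, c=180)   # assumed window over the gray scale
    print("index of fuzziness:", round(linear_index_of_fuzziness(mu), 4))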


Medical Data Handling Using Cloud Computing And A Proposal for Countrywide Medical System

Vol.7 No.1
Year: 2012
Issue: July-September
Title: Medical Data Handling Using Cloud Computing And A Proposal for Countrywide Medical System   
Author Name: Samayita Bhattacharya, Kalyani Mali   
Synopsis:   
This paper focuses on hosting and analyzing medical diagnostic data using cloud computing. Cloud computing is a general term for anything that involves delivering hosted services over the Internet. This is a project proposal for a medical database system using cloud computing. The proposed database system can provide new delivery models to make healthcare more efficient and effective, at a lower cost to technology budgets.

Integration of Color and Texture features for Content Based Image Retrieval

Vol.7 No.1
Year: 2012
Issue: July-September
Title: Integration of Color and Texture features for Content Based Image Retrieval   
Author Name: Kandala Lakshmi Aparna, Venu Gopala Rao
Synopsis:   
This paper presents a new image indexing and retrieval algorithm that combines a color feature (the RGB histogram) with a texture feature (local derivative patterns, LDP). The LDP texture feature extracts high-order local information by encoding various distinctive spatial relationships within a given local region, while the color histogram captures the distribution of colors in an image. Experiments were carried out on the Corel 1000 database to demonstrate the worth of the algorithm. The results show a significant improvement in the evaluation measures compared with LDP and the RGB histogram used individually.
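
The LDP encoding itself is not reproduced in the abstract; as a rough sketch of color-texture feature fusion, the following concatenates an RGB histogram with a uniform local binary pattern (LBP) histogram from scikit-image as a stand-in texture descriptor. The bin counts and LBP parameters are assumptions, not the authors' settings.

# Sketch: fuse a color histogram with a texture histogram into one feature vector.
# LBP stands in here for the paper's local derivative patterns (LDP).
import numpy as np
from skimage.feature import local_binary_pattern

def color_texture_feature(rgb_image, color_bins=8, lbp_points=8, lbp_radius=1):
    color = np.concatenate([np.histogram(rgb_image[:, :, c], bins=color_bins,
                                         range=(0, 256))[0] for c in range(3)]).astype(float)
    gray = rgb_image.mean(axis=2).astype(np.uint8)
    lbp = local_binary_pattern(gray, lbp_points, lbp_radius, method="uniform")
    texture, _ = np.histogram(lbp, bins=lbp_points + 2, range=(0, lbp_points + 2))
    # Normalize each part separately so neither modality dominates, then concatenate.
    color /= max(color.sum(), 1.0)
    texture = texture.astype(float) / max(texture.sum(), 1.0)
    return np.concatenate([color, texture])

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    img = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)
    print(color_texture_feature(img).shape)  # 3*8 color bins + 10 LBP bins = (34,)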

Segmentation of Brain MRI Images for Tumor extraction by combining k-means clustering and Watershed algorithm

Vol.7 No.1
Year: 2012
Issue: July-September
Title: Segmentation of Brain MRI Images for Tumor extraction by combining k-means clustering and Watershed algorithm   
Author Name: Kailash Sinha, G.R. Sinha
Synopsis:   
In medical image processing, brain tumor extraction is one of the challenging tasks, since brain images are complicated and tumors can be analyzed only by expert physicians. The location of a tumor in the brain is one of the factors that determine how it affects an individual's functioning and what symptoms it causes. We propose a methodology in this paper that integrates k-means clustering and the watershed algorithm for tumor extraction from 2D MRI (magnetic resonance imaging) images. The conventional watershed algorithm is widely used for medical image analysis because of advantages such as always constructing a complete division of the image; its disadvantages include over-segmentation and sensitivity to false edges. The k-means clustering algorithm, an unsupervised learning algorithm, is used to produce a primary segmentation of the image before the watershed algorithm, which applies automated thresholding on the gradient magnitude map, is applied. It can be observed that the method successfully detects the brain tumor size and region.
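
A minimal sketch of the two-stage idea on a grayscale slice: k-means on pixel intensities gives a rough primary segmentation, whose brightest cluster seeds markers for a watershed on the Sobel gradient magnitude. The library choices, cluster count, and the "brightest cluster is the tumor candidate" rule are assumptions, not the authors' exact pipeline.

# Sketch: primary segmentation by k-means on intensities, refined by a
# marker-based watershed on the Sobel gradient magnitude.
import numpy as np
from sklearn.cluster import KMeans
from skimage.filters import sobel
from skimage.segmentation import watershed

def segment_tumor_candidate(slice_2d, n_clusters=4):
    intensities = slice_2d.reshape(-1, 1).astype(float)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(intensities)
    labels = labels.reshape(slice_2d.shape)
    # Assumed heuristic: the cluster with the highest mean intensity is the tumor candidate.
    bright = max(range(n_clusters), key=lambda k: slice_2d[labels == k].mean())
    markers = np.zeros(slice_2d.shape, dtype=int)
    markers[slice_2d < np.percentile(slice_2d, 25)] = 1   # confident background seeds
    markers[labels == bright] = 2                         # tumor-candidate seeds
    gradient = sobel(slice_2d.astype(float))
    return watershed(gradient, markers) == 2              # boolean candidate mask

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    fake = rng.normal(0.3, 0.05, (128, 128)); fake[40:70, 50:80] += 0.5
    print("candidate pixels:", int(segment_tumor_candidate(fake).sum()))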


A Survey on Testing Strategies for User Interface and its Applications

Vol.7 No.1
Year: 2012
Issue: July-September
Title: A Survey on Testing Strategies for User Interface and its Applications   
Author Name: Ashwin Karthick, P.V.S. Sarma, S. Amarnath Babu, P. Harini, A.S.A.L.G.G. Gupta
Synopsis:   
User interface design is a subset of the field of study called human-computer interaction. A user interface is a collection of techniques and mechanisms for interacting with something; in a graphical interface, the interaction mechanism is a pointing device of some kind, and what it interacts with is a collection of elements referred to as objects. Event-Driven Software (EDS) can change state based on incoming events; common examples are GUI and web applications. GUI testing checks the look and feel of the application, while UI testing is user interface testing carried out in front of the user. Various tools are available for automated GUI testing and for web application testing. Web applications are built using ASP, JSP, PHP, or servlets, while the GUI is built using Java technology. Our specific contribution is to develop a single testing tool for testing both GUIs and web applications together. Various GUI and web-based testing tools are compared.

Monday 15 April 2013

Predicting Object Oriented Software Systems Maintainability At Design Level Using K Means Clustering Technique

Vol.6 No.4
Year: 2012
Issue: April-June
Title: Predicting Object Oriented Software Systems Maintainability At Design Level Using K Means Clustering Technique   
Author Name: Dr. Anil Kumar Malviya, Vinod Kumar Yadav
Synopsis: 
Software maintenance is the single most expensive activity in software development. One way to control maintenance cost is to use software metrics during the design phase of development. This paper examines the application of the k-means clustering technique for identifying maintainable classes using object-oriented metrics. K-means clustering is used to evaluate the maintainability of object-oriented systems based on class data from the UIMS (User Interface Management System) and QUES (Quality Evaluation System) models. Among clustering techniques, k-means (partition) clustering constructs non-overlapping groups. We not only present preliminary experimental work on software maintainability prediction using software metrics, with the sample data simulated in MATLAB, but also assess the significance of the goodness of the clusters using the chi-square test. Experimental results in MATLAB show that the algorithm can identify the cluster with the best goodness of fit among clusters using the chi-square test, which helps software designers and maintainers take appropriate action at the design level. It can also be used by software designers to change or modify the design of difficult-to-maintain classes at the design level.
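
A small sketch of the evaluation idea on synthetic data, since the UIMS/QUES metric values are not given here: classes described by a few object-oriented metrics are grouped with k-means, and a chi-square test checks whether cluster membership is associated with a maintainability label. Scikit-learn and SciPy stand in for the MATLAB implementation; the metric names and data are illustrative assumptions.

# Sketch: cluster classes by OO design metrics, then test the cluster-vs-maintainability
# association with a chi-square test (synthetic stand-in data).
import numpy as np
from sklearn.cluster import KMeans
from scipy.stats import chi2_contingency

rng = np.random.default_rng(42)
# Hypothetical metrics per class: WMC-like size, DIT-like depth, coupling count.
easy = rng.normal([10, 2, 3], [3, 1, 1], size=(40, 3))
hard = rng.normal([30, 5, 9], [5, 1, 2], size=(40, 3))
metrics = np.vstack([easy, hard])
maintainable = np.array([1] * 40 + [0] * 40)          # assumed ground-truth label

clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(metrics)

# Contingency table: rows = cluster, columns = maintainable / hard-to-maintain.
table = np.array([[np.sum((clusters == c) & (maintainable == m)) for m in (1, 0)]
                  for c in (0, 1)])
chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f} (low p suggests clusters track maintainability)")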

Improving Performance of MANETS by Optimized-AODV (OAODV) Protocol Technique by Configuring NS-2

Vol.6 No.4
Year: 2012
Issue: April-June
Title: Improving Performance of MANETS by Optimized-AODV (OAODV) Protocol Technique by Configuring NS-2 
Author Name: Vishal Sharma, Tanu Preet Singh, Singh   
Synopsis:                                 
The network simulator NS-2 is a standard simulator for designing new protocols and is considered one of the best for obtaining correct results. NS-2 works in three parts: one is TCL scripting, and the other two are based on high-level languages such as C++ or Java. This paper deals with how to write the necessary code and the various types of scripts for demonstrating the optimization of energy and bandwidth in MANETs. It also shows how the throughput, bandwidth, and energy efficiency of the network structure are optimized using the NS-2 simulator. The AODV protocol is widely used for routing in ad hoc networks, but it does not support error recovery and dead-state notification; we therefore modified the existing AODV protocol into an optimized version termed OAODV (Optimized AODV).

A Modified Ant System using Gaussian Probabilistic Pheromone Updation Technique

Vol.6 No.4
Year: 2012
Issue: April-June
Title: A Modified Ant System using Gaussian Probabilistic Pheromone Updation Technique                                
Author Name: Anirban Pal, Debarghya Das, Abhishek Paul   
Synopsis:   
Ant Colony Optimization (ACO) is mainly inspired by the foraging behavior of ants. In this paper, we propose a modified ant system, called the Gaussian Probabilistic Ant System (GPAS), for probabilistic pheromone updating. The proposed algorithm incorporates a probabilistic property into the pheromone trail deposition factor ρ (rho). We use the Gaussian distribution, named after the mathematician and physicist Carl Friedrich Gauss, to update ρ in GPAS. The trail deposition factor ρ is in general a static factor; here it is made probabilistic so as to increase the effectiveness of the ant system in finding the optimal tour for the Traveling Salesman Problem (TSP). GPAS adapts its properties to the requirements of the surrounding domain to improve its performance in dynamic environments. An experimental evaluation was conducted to determine the usefulness of the new strategy, using selected benchmark problems from the TSP library [6]. Our algorithm shows effective results that are comparable to other existing approaches.
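
A compact sketch of an ant system for TSP in which the deposition/evaporation factor ρ is redrawn from a Gaussian each iteration and clipped to [0, 1], which is one plausible reading of the abstract's probabilistic ρ update. The tour-construction rule is the standard ant system formula, and all parameter values are assumptions.

# Sketch: basic ant system for TSP with a Gaussian-distributed rho per iteration.
import numpy as np

def gpas_tsp(dist, n_ants=20, n_iters=100, alpha=1.0, beta=3.0,
             rho_mean=0.5, rho_std=0.1, q=1.0, seed=0):
    rng = np.random.default_rng(seed)
    n = len(dist)
    eta = 1.0 / (dist + np.eye(n))        # heuristic visibility (avoid divide-by-zero)
    tau = np.ones((n, n))
    best_tour, best_len = None, np.inf
    for _ in range(n_iters):
        rho = float(np.clip(rng.normal(rho_mean, rho_std), 0.0, 1.0))  # probabilistic rho
        tours = []
        for _ in range(n_ants):
            tour = [int(rng.integers(n))]
            while len(tour) < n:
                i = tour[-1]
                mask = np.ones(n, bool); mask[tour] = False
                prob = (tau[i] ** alpha) * (eta[i] ** beta) * mask
                tour.append(int(rng.choice(n, p=prob / prob.sum())))
            length = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        tau *= (1.0 - rho)                 # evaporation with the sampled rho
        for tour, length in tours:
            for k in range(n):
                tau[tour[k], tour[(k + 1) % n]] += q / length
    return best_tour, best_len

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    pts = rng.random((12, 2))
    dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    print(gpas_tsp(dist))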

Intrusion Detection System for Relational Databases

Vol.6 No.4
Year: 2012
Issue: April-June
Title: Intrusion Detection System for Relational Databases                                    
Author Name: Dr. S. Jeya, S. Muthu Perumal Pillai   
Synopsis: 
An intrusion detection system for a relational database is responsible for issuing a suitable response to an anomalous request. We propose the notion of database response policies to support our intrusion response system tailored for a DBMS. Our interactive response policy language makes it very easy for database administrators to specify appropriate response actions for different circumstances, depending on the nature of the anomalous request. The two main issues that we address in the context of such response policies are data matching and data administration. We propose a novel Joint Threshold Administration Model (JTAM) based on the principle of separation of duty. The key idea in JTAM is that a policy object is jointly administered by at least k database administrators (DBAs); that is, any modification made to a policy object will be invalid unless it has been authorized by at least k DBAs. We present the design details of JTAM, which is based on a cryptographic threshold signature scheme, and show how JTAM prevents malicious modifications to policy objects by authorized users. We also implement JTAM in the PostgreSQL DBMS, and report experimental results on the efficiency of our techniques.

A Comparative study of Software Quality Prediction Techniques for Object Oriented System

Vol.6 No.4
Year: 2012
Issue: April-June
Title: A Comparative study of Software Quality Prediction Techniques for Object Oriented System   
Author Name: Dharmendra Lal Gupta, Dr. Anil Kumar Malviya   
Synopsis:   
Quality is the fundamental requirement of the user of a product, which is why it is the moral responsibility of the producer to understand it and deliver it. Software quality can be predicted either from historical data gathered during the implementation of the same or similar software projects, or from design metrics collected during the design phase of the SDLC (Software Development Life Cycle). With the help of such a prediction technique, one can at least roughly predict the quality of the next iteration of the required system. In recent years the challenges for quality prediction systems have grown due to the tremendous growth in customers and products. In this survey paper we discuss and compare different software quality prediction techniques for improving the quality of object-oriented software, and collect them in a single place. This study provides a comparative analysis that helps in selecting an appropriate approach according to need.

Tuesday 9 April 2013

Data Mining Techniques and Applications- A Review

Vol.6 No.3
Year: 2012
Issue: January-March
Title: Data Mining Techniques and Applications- A Review   
Author Name: Venkatesan Thillainayagam   
Synopsis:   
Data mining is the process of extracting patterns from data. Basically, data mining is the analysis of observational data sets to find unsuspected associations and to summarize the data in new ways that are both clear and useful to the data owner. It is seen as an increasingly important tool by modern businesses for transforming data into business intelligence, giving an informational advantage. The automated, prospective analyses offered by data mining move beyond the analyses of past events provided by the retrospective tools typical of decision support systems. This review paper discusses a few data mining techniques and algorithms, and some of the organizations that have adopted data mining technology to improve their businesses with excellent results. Data mining tools can answer business questions that were traditionally too time consuming to resolve. Data mining is becoming increasingly common in both the private and public sectors. Industries such as banking, insurance, medicine, and retail commonly use data mining to reduce costs, enhance research, and increase sales. To be successful, data mining still requires skilled technical and analytical specialists who can structure the analysis and interpret the output that is created.

Improving Customer Relationship Management in Electronic Transaction Expansion in Banking Sector by using Data mining techniques

Vol.6 No.3
Year: 2012
Issue: January-March
Title: Improving Customer Relationship Management in Electronic Transaction Expansion in Banking Sector by using Data mining techniques   
Author Name: Bhaskar Reddy Muvva Vijay, Luel Brahen   
Synopsis:   
Today, many businesses such as banks, insurance companies, and other service providers realize the importance of Customer Relationship Management (CRM) and its potential to help them acquire new customers, retain existing ones, and maximize their lifetime value. Data mining offers an opportunity to use a variety of data analysis and modeling methods to detect specific trends and relationships in data. This helps businesses understand what a customer wants and anticipate what they will do. In this paper we examine the application of k-means clustering and the J48 decision tree classification algorithm to CRM, in the case of the EFT/POS service of Dashen Bank S.C. The work was carried out within the framework of the CRISP-DM model. The final dataset consists of 110,000 records, on which clustering models at k values of 6, 5, and 4 with different seed values were built and evaluated against their performance. The cluster model at k = 6 with the default seed value showed the best performance, using the Weka 3.7.2 tool.
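
A small sketch of the model-selection step described (comparing k = 6, 5, and 4 with different seeds), using scikit-learn and the silhouette score on synthetic data as stand-ins for the Weka tool and the bank's 110,000-record dataset. The features and the evaluation measure are assumptions.

# Sketch: compare k-means models at several k values and seeds by silhouette score.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(7)
# Hypothetical per-customer EFT/POS features: monthly txn count, avg amount, ATM share.
data = np.vstack([rng.normal(center, 0.4, size=(300, 3))
                  for center in ([1, 2, 0], [3, 1, 1], [2, 4, 2], [5, 3, 0])])

for k in (6, 5, 4):
    for seed in (0, 1):
        labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(data)
        print(f"k={k} seed={seed} silhouette={silhouette_score(data, labels):.3f}")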

Mathematical Modeling of Markovian Queuing Network with Repairs, Breakdown and fixed Buffer

Vol.6 No.3
Year: 2012
Issue: January-March
Title: Mathematical Modeling of Markovian Queuing Network with Repairs, Breakdown and fixed Buffer 
Author Name: Mamatha E, C.S. Reddy, Ramakrishna Prasad   
Synopsis:   
Nowadays, many practical queuing systems used extensively in computing and communication have finite capacity, and in such systems servers are prone to failure. Queuing networks are widely used in modeling transaction processing systems and the interactions among nodes in communication networks. This paper considers the performance modeling of a multi-node system with heterogeneous nodes, where each node serves external arrivals as well as jobs routed internally from other nodes. Results obtained using the analytical model are analyzed.
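
The paper's exact multi-node model is not reproduced in the abstract; as a baseline illustration of a single finite-buffer Markovian node, the standard M/M/1/K results below give the steady-state distribution and blocking probability for arrival rate λ, service rate μ, utilization ρ = λ/μ ≠ 1, and system capacity K.

P_n = \frac{(1-\rho)\,\rho^{n}}{1-\rho^{K+1}}, \quad n = 0,1,\dots,K, \qquad
P_{\text{block}} = P_K = \frac{(1-\rho)\,\rho^{K}}{1-\rho^{K+1}}, \qquad
L = \frac{\rho}{1-\rho} - \frac{(K+1)\,\rho^{K+1}}{1-\rho^{K+1}}.

The effective arrival rate accepted by such a node is then λ(1 − P_K).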

A Heuristic Technique for Automated Test Cases Generation from UML Activity Diagram

Vol.6 No.3
Year: 2012
Issue: January-March
Title: A Heuristic Technique for Automated Test Cases Generation from UML Activity Diagram   
Author Name: A.V.K. Shanthi, G. Mohan Kumar   
Synopsis:   
In software development, testing plays an important role: software testing is the phase that ensures the quality of the software. This paper proposes a heuristic technique to test the software at the initial stage itself, so that it becomes easier for software testers to test the software in later stages. Test cases are the key entities by which software is evaluated. Although test cases can be generated by various approaches, Unified Modeling Language based generation has attracted recent researchers and industry. This paper focuses on test case generation from UML activity diagrams using a genetic algorithm, in which the best test cases are optimized and the generated test cases are validated by prioritization. The test cases generated using our approach are capable of detecting more faults, such as synchronization faults and loop faults, than the existing approaches. A case study is used to illustrate the approach.

Email Security Using Two Cryptographic Hybrids of Mediated and Identity-Based Cryptography

Vol.6 No.3
Year: 2012
Issue: January-March
Title: Email Security Using Two Cryptographic Hybrids of Mediated and Identity-Based Cryptography   
Author Name: Sufyan T. Faraj, Hussein Khalid Abd-alrazzaq   
Synopsis:   
The security of email has been an important issue for scientific research since the nineties of the last century. This mainly comes from the wide use of email for exchanging various kinds of information, some of it important or sensitive. Although several solutions have been offered to solve this problem, most email messages sent so far have carried no security at all. The main reason is that previous systems relied on traditional public key cryptography, which is too complicated, from a usability point of view, for most users. In this work, we exploit Identity-Based Cryptography (IBC) to solve this usability problem. Indeed, to further increase the system's strength, IBC has been combined with mediated RSA cryptography. Our proposal includes the deployment of two promising hybrids of mediated IBC. In both of these hybrid cryptographic systems, all operations of encryption/decryption and signature/verification have been considered. The proposed system has met its design objectives either totally or partially. We believe that our proposed hybrids of mediated IBC can be very helpful in simplifying the use of email security and thereby increasing the number of users of such systems.

Monday 8 April 2013

Biosorption Of As(V) From Contaminated Water Onto Tea Waste Biomass: Sorption Parameters Optimization, Equilibrium And Thermodynamic Studies

Vol.6 No.2
Year: 2011
Issue: October-December
Title: Biosorption Of As(V) From Contaminated Water Onto Tea Waste Biomass: Sorption Parameters Optimization, Equilibrium And Thermodynamic Studies   
Author Name: Surajit Kundu, S. Suresh, C.B. Majumder , S. Chand   
Synopsis:   
The removal of As(V) ions from contaminated water by biosorption onto low-cost tea waste (TW) biomass was investigated. The TW biosorbent was characterized by Fourier transform infrared spectroscopy (FTIR) and scanning electron microscopy (SEM) to assess its surface properties. The effects of pH, temperature, dosage, and contact time on biosorption were studied; concentrations were measured with a spectrophotometer at a wavelength (λ) of 540 nm and cross-checked with ICP-MS. The optimum pH and temperature for As(V) biosorption were found to be 4 and 35 °C, respectively. The maximum uptake capacity of As(V) was 2.12 mg/g at an initial concentration of 100 mg/L, a dosage of 8 g, and a contact time of 8 hours. The equilibrium data were well described by the Freundlich isotherm model. The positive values of ΔS° (kJ/mol·K) and ΔH° (kJ/mol) indicate the endothermic nature of the biosorption process, and the spontaneity of As(V) sorption onto TW biomass was confirmed by the negative value of ΔG° (kJ/mol).
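
For reference, the Freundlich isotherm used to fit the equilibrium data and the standard thermodynamic relations behind the ΔG°, ΔH°, and ΔS° discussion take the forms below; K_F, n, and the equilibrium constant K_c are fitted quantities whose values are not reported in the abstract.

q_e = K_F\,C_e^{1/n}, \qquad \ln q_e = \ln K_F + \tfrac{1}{n}\ln C_e, \qquad
\Delta G^\circ = -RT\ln K_c, \qquad \Delta G^\circ = \Delta H^\circ - T\,\Delta S^\circ .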

Component Based Software Development Models: A Comparative View

Vol.6 No.2
Year: 2011
Issue: October-December
Title: Component Based Software Development Models: A Comparative View
Author Name: Ravneet Kaur Grewal, Shivani Goel
Synopsis:
Component Based Software Development is a development paradigm in which software systems are assembled from a number of components. This approach promises high quality, low budget software with a shorter time to market. Software development process models used for writing traditional programs cannot be used for assembling applications from reusable components; the traditional models need new methodologies to support component based development. Many component based software development models have been proposed by researchers. This paper discusses the advantages and limitations of current models for developing component based systems.

Friday 5 April 2013

Mobility Tracking in Mobile Ad-Hoc Networks on Geological Location

Vol.6 No.1
Year: 2011
Issue: July-September
Title: Mobility Tracking in Mobile Ad-Hoc Networks on Geological Location   
Author Name: Subramani A, Krishnan A   
Synopsis:   
In mobile ad hoc networks, nodes change position due to their dynamic nature, so there has to be a provision to monitor performance and position on a regular basis. In this paper, the importance of management schemes in ad hoc networks is studied. Besides this, mobility models are reviewed and rated by incorporating real-life applications. A model for the operation of an ad hoc network and the effect of mobile nodes is explored, where the model incorporates incentives for users to act as transit nodes on multi-hop routes in return for the ability to send their own traffic. The implications of the model are then explored by simulating a network, illustrating how network resources are allocated to users based on geological position. The incentives for mobile nodes to work together are also discussed in this paper. The mobility and traffic patterns of the mobility models are generated using the AnSim simulator.

Trust Enhanced Secure Mobile Ad-hoc Network with Neighbor Collaboration Routing (SMNCR)

Vol.6 No.1
Year: 2011
Issue: July-September
Title: Trust Enhanced Secure Mobile Ad-hoc Network with Neighbor Collaboration Routing (SMNCR)   
Author Name: Subramani A, Krishnan A, Muthusamy   
Synopsis:   
In recent years, many trust and reputation models have been proposed to reinforce the security of mobile ad hoc networks. However, they either fail to capture evidence of trustworthiness within the constraints of the network, or introduce additional problems while capturing the evidence. In this paper, we propose a reputation-based trust model called Secure MANET with Neighbor Collaboration Routing (SMNCR). In our model, evidence of trustworthiness is captured efficiently and from broader views, including direct interactions with neighbors, observations of neighbors' interactions, and recommendations. SMNCR captures evidence from direct interactions with neighbors in order to identify their benign and malicious behaviors. It also captures evidence of misbehavior by observing the interactions of neighbors. Further, the evidence captured from recommendations is used to summarize the benign behavior of multi-hop nodes. In contrast to other models, we adopt a novel approach to capturing evidence from recommendations that eliminates recommender bias, free-riding, and honest elicitation. SMNCR uses the captured evidence to predict whether a node is benign or misbehaving. It then applies the prediction to strengthen the security of communications depending on decision policies, such as whether to send a packet to, or forward a packet on behalf of, other nodes. Finally, we demonstrate the performance of our model through simulation results.

Data stream classification Detection System Using Genetic Algorithm

Vol.6 No.1
Year: 2011
Issue: July-September
Title: Data stream classification Detection System Using Genetic Algorithm   
Author Name: Jeya S, S. Muthu Perumal Pillai   
Synopsis:   
As the transmission of data over the Internet increases, the need to protect connected systems also increases. Although the field of intrusion detection systems (IDSs) is still developing, existing systems are not yet complete, in the sense that they are not able to detect all types of intrusions: some attacks detected by the tools available today cannot be detected by other products, depending on the types and methods they are built on. A Data Stream Classification (DSC) detection system using a Genetic Algorithm (GA) is the latest technology used for this purpose. The behavior of the genetic algorithm, a popular approach to search and optimization problems, is known to depend on, among other factors, the fitness function, the recombination operator, and the mutation operator. In real-world data stream classification problems, such as intrusion detection, text classification, and fault detection, novel classes may appear at any time in the stream. Traditional data stream classification techniques would be unable to detect an intrusion until the classification models are trained; here they are trained with a genetic algorithm. We applied this technique to network traffic. A network intrusion detection system should be adaptable to all the critical situations that arise in a network, which helps in identifying complex anomalous behaviors. This work is focused on the TCP/IP network protocols. Data stream classification poses many challenges, some of which have not been addressed yet.

User Dependent Handwriting Recognition using off-line OCR for Arithmetic Operation and Text Processing

Vol.6 No.1
Year: 2011
Issue: July-September
Title: User Dependent Handwriting Recognition using off-line OCR for Arithmetic Operation and Text Processing   
Author Name: Dwarkoba Gaikwad, Yogesh Gunge , Raghunandan Mundada , Swapnil Patil , Himani Agrawal   
Synopsis:   
This paper proposes a generic font description model for optical character recognition, based on the concept of an evolutionary computing architecture. It is user interface application software that learns and recognizes the handwritten text of a particular user. The system allows the user to write commands for the computer on blank paper and control the computer's operations via a conversational agent. The purpose is to design an easy interface to the computer for computer-illiterate persons. Text written by the user becomes available to the computer for further processing such as text editing, narration, and messaging. The proposed system has manifold applications: in government offices for storing lakhs of files of records, in business meetings for maintaining a review of the discussion, and in education for converting professors' notes into soft copies. Blind people can benefit greatly from this system, as it supports a narrator application. We propose an algorithm to avoid ambiguity. The system was tested with handwritten characters and sentences from different users, and it recognized all characters and sentences of all users correctly. Commands written on paper by the user were also recognized and executed by the system. The performance of the system is found to be approximately 92%. The system is cost effective, as it requires very little hardware support beyond a camera or scanner.

Speech Analysis Technique for Speech Compression

Vol.6 No.1
Year: 2011
Issue: July-September
Title: Speech Analysis Technique for Speech Compression   
Author Name: Hind R. Mohammed, Iman Qays Abduljaleel   
Synopsis:   
This paper presents a comparison between important transforms used for speech signals (the Fast Fourier Transform, the Discrete Wavelet Transform, the Wavelet Packet Transform, and a combined Discrete Wavelet-Fast Fourier Transform) and compresses the coefficients obtained from each transform by applying a zero coder before an arithmetic coding algorithm. We find that the DWT and WPT are the best transforms for speech signal analysis, that using the zero coder increases the compression ratio, and that the DWT-FFT transform gives poor compression because the coefficients obtained after the FFT stage vary widely and increase the number of symbols entering the arithmetic coder. The compression process was analyzed by comparing the compressed-decompressed signal against the original.
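
A rough sketch of the DWT-plus-zero-coding stage on a toy signal: small wavelet coefficients are zeroed, and the fraction of zeros, which a zero coder followed by an arithmetic coder would exploit, is reported along with the reconstruction SNR. PyWavelets is used, and the keep-ratio threshold rule is an assumption rather than the authors' coder.

# Sketch: DWT-based speech compression stage; zero out small coefficients and
# measure how much a zero coder + entropy coder could exploit.
import numpy as np
import pywt

def dwt_compress(signal, wavelet="db4", level=4, keep_ratio=0.1):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    flat = np.concatenate(coeffs)
    thresh = np.quantile(np.abs(flat), 1.0 - keep_ratio)   # keep largest 10% (assumed)
    kept = [pywt.threshold(c, thresh, mode="hard") for c in coeffs]
    zero_fraction = np.mean(np.concatenate(kept) == 0.0)
    reconstructed = pywt.waverec(kept, wavelet)[:len(signal)]
    snr = 10 * np.log10(np.sum(signal ** 2) / np.sum((signal - reconstructed) ** 2))
    return zero_fraction, snr

if __name__ == "__main__":
    t = np.linspace(0, 1, 8000)
    speech_like = np.sin(2 * np.pi * 200 * t) * np.exp(-3 * t)   # toy stand-in for speech
    zeros, snr = dwt_compress(speech_like)
    print(f"zero fraction: {zeros:.2f}, reconstruction SNR: {snr:.1f} dB")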

Self Configurable Dynamic Fuzzy Voter for Safety Critical Systems using Statistical Parameters

Vol.6 No.1
Year: 2011
Issue: July-September
Title: Self Configurable Dynamic Fuzzy Voter for Safety Critical Systems using Statistical Parameters   
Author Name: Phani Kumar, SeethaRamaiah Panchumarthy                                   
Synopsis: 
The main objective of this research paper is to design a dynamic fuzzy voter with automatic fuzzy parameter selection for safety-critical systems with limited system knowledge. Existing fuzzy voters for controlling safety-critical systems and fuzzy voters used for sensor fusion are surveyed, and the major limitation identified is static fuzzy parameter selection. Static fuzzy parameters work only for a particular set of data with known data ranges, for which optimized parameter values are selected; these values may not work for other data sets with different ranges, and the static selection method may not work for continuously changing data ranges. In this paper a dynamic (automatic) fuzzy parameter selection method for fuzzy voters is proposed, based on the local set of data in each voting cycle. The fuzzy bandwidth is decided from statistical parameters, namely the mean and standard deviation of the local data set, and the fuzzy parameters are updated accordingly in each voting cycle. Safety performance is empirically evaluated by running the static and dynamic fuzzy voters on a simulated Triple Modular Redundant (TMR) system for 10,000 voting cycles. Experimental results show that the proposed dynamic fuzzy voter gives almost 100% safety when two of the three modules of the TMR system are error free, and better safety performance than the existing static fuzzy voter under multiple error conditions. The dynamic voter is designed so that it can simply be plugged into any safety-critical system without any knowledge of the data produced or its ranges, as it processes the data locally in each voting cycle using statistical parameters.
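
One plausible reading of the dynamic voter, sketched below for a TMR system: in each voting cycle the fuzzy agreement bandwidth is derived from the standard deviation of the three module outputs, each output is weighted by how well it agrees with the others under a Gaussian membership, and the weighted average is returned. The membership shape and the total-disagreement safeguard are assumptions, not the authors' exact design.

# Sketch: dynamic fuzzy voter for a triple-modular-redundant (TMR) system.
# The fuzzy bandwidth is re-derived every voting cycle from the local statistics.
import numpy as np

def dynamic_fuzzy_vote(outputs, min_bandwidth=1e-6):
    x = np.asarray(outputs, dtype=float)          # three redundant module outputs
    bandwidth = max(np.std(x), min_bandwidth)     # fuzzy bandwidth from local statistics
    # Agreement of each module with the others: Gaussian membership of pairwise distances.
    weights = np.zeros(len(x))
    for i in range(len(x)):
        others = np.delete(x, i)
        weights[i] = np.mean(np.exp(-((x[i] - others) / bandwidth) ** 2))
    if weights.sum() == 0.0:                      # total disagreement: flag an unsafe cycle
        return None
    return float(np.dot(weights, x) / weights.sum())

if __name__ == "__main__":
    print(dynamic_fuzzy_vote([10.02, 9.98, 10.01]))   # all modules agree: result near 10
    print(dynamic_fuzzy_vote([10.02, 9.98, 55.0]))    # the faulty module is down-weighted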

Fuzzy-based TCP Congestion Control Mechanism

Vol.6 No.1
Year: 2011
Issue: July-September
Title: Fuzzy-based TCP Congestion Control Mechanism   
Author Name: Samah A. B. Mustafa   
Synopsis:   
The Internet has experienced explosive growth accompanied by severe congestion problems. The TCP congestion control mechanism is crucial for efficient use of the Internet despite largely unpredictable user access patterns and resource bottlenecks and limitations. This paper reviews and evaluates the current Reno and Vegas TCP congestion control mechanisms. In general, these mechanisms repeatedly increase the load in an effort to find the flow rate that meets the varying demands of the end devices; however, each attains a different utilization of Internet resources. Further, the authors present a new controller based on fuzzy concepts to overcome the limitations of the current TCP Reno protocol. The fuzzy controller in this work is a static algorithm with a static rule base. All evaluation data were obtained from the NS2 simulator.

Wednesday 3 April 2013

Automatic Parameter Selection for Fuzzy Entropic Optimal Threshold to Enhance the Stumpy Foreground Objects in the Image Based on a Fuzziness Measure

Vol.5 No.4
Year: 2011
Issue: April-June
Title: Automatic Parameter Selection for Fuzzy Entropic Optimal Threshold to Enhance the Stumpy Foreground Objects in the Image Based on a Fuzziness Measure
Author Name: Naga Raju C, Pruthvi, Siva Priya
Synopsis:
An automatic parameter selection method for fuzzy entropic optimal thresholding, used to enhance stumpy foreground objects in images based on a fuzziness measure, is presented in this paper. This work is an improvement of an existing method. Using fuzzy logic, the problems involved in finding the minimum of a criterion function are avoided. Similarity between gray levels is the key to finding the optimal threshold: the regions of gray levels located at the boundaries of the ROI are initialized, and then a similarity process based on an index of fuzziness is started to find the thresholding point. The advantages of the proposed method over its conventional counterparts are fourfold. First, the fuzzy parameters for the optimal threshold are selected automatically. Second, the ROI is used instead of the whole image, so any irregularity outside the ROI has no influence on estimating the threshold. Third, it provides a mechanism to handle the different costs of different types of classification error in response to practical requirements. Fourth, by appropriately specifying the lower and upper bounds of the background proportion within the constrained gray-level range, the proposed method yields substantially more robust and more reliable segmentation.

Online E-Book Store Website Design

Vol.5 No.4
Year: 2011
Issue: April-June
Title: Online E-Book Store Website Design
Author Name: Mohammed A. Abdala, Noor Ahmed Khider
Synopsis:
The ability to perform safe, quick, and efficient commercial transactions has made the idea of selling and purchasing goods online very popular and profitable, especially with the widespread use of the Internet throughout the world. In this paper, an e-commerce website for online book sales (an e-book store) is developed that offers a collection of scientific and engineering books to choose from. The books are arranged in different categories; some are free and others are available for purchase in the store. The site offers a search facility to help find books more efficiently and a friendly interface to make the buying process easy and simple. The site administrator is provided with a simple control panel to facilitate adding and removing books from the site's database. The system uses a shopping cart that stores customer orders and allows each customer to order more than one book at the same time. For validation and simulation purposes, the Visa card standard was adopted in the design of a virtual bank database that contains user account information; this database is connected to the system and used to validate users' requests. The system was implemented using the PHP programming language with a MySQL relational database on an Apache server, and was tested using Internet Explorer 7 and Mozilla Firefox.

CPA Based Multi-Robot Path Finding Algorithm for Wireless Multihop Network

Vol.5 No.4
Year: 2011
Issue: April-June
Title: CPA Based Multi-Robot Path Finding Algorithm for Wireless Multihop Network
Author Name: Mohammad Malik Mubeen
Synopsis:
The main theme of this project is to apply wireless multi-hop communication to collaborative path finding. Multiple robots are used in path finding: when a robot is able to communicate with its colleague robots, path finding is solved collaboratively among them. The goal of the communication is to spread map information as quickly as possible. Every robot shares its own map information and the network's map information with all other robots simultaneously by broadcasting, thus reducing the time needed to find an efficient path from source to destination. For this purpose, a cooperative path finding algorithm (CPA) is proposed for a collaborative robot system. This is useful in many applications, such as area monitoring, where path finding is very important. The project is planned to be implemented with the help of NS-2.