Saturday 30 March 2013

UML Based Visualization of Objects for Partial Software Reengineering

Vol.5 No.1
Year: 2010
Issue: July - September
Title: UML Based Visualization of Objects for Partial Software Reengineering   
Author Name: Sumesh Sood, Arvind Kalia , Hardeep Singh , Shalini Sood   
Synopsis:   
The law of software entropy dictates that even when a system starts out in a well-designed state, user requirements change and users demand new functionality. As requirements change with the passage of time, it becomes mandatory to change the software, and these changes may introduce new bugs into the system, so that after some time it requires regular maintenance. To decide whether software should be maintained, reengineered, or retired and rebuilt, two metrics are proposed in this work. The proposed software reengineering metrics (RRC and RRCM) can be used to calculate the reengineering requirement cost of the entire software and of each module. UML diagrams are used to identify the different components of the software. The results obtained after applying the proposed metrics form the basis for the decision on whether the software requires maintenance, reengineering, or retirement.

Performance Evaluation and Comparison of Routing Algorithms in ZigBee/IEEE 802.15.4

Vol.5 No.1
Year: 2010
Issue: July - September
Title: Performance Evaluation and Comparison of Routing Algorithms in ZigBee/IEEE 802.15.4
Author Name: P. Anitha, Chandrasekar C   
Synopsis:   
Real applications of wireless sensor networks (WSNs) require connectivity between nodes to transmit the collected data to a sink node. The primary design goals of low-rate wireless mesh personal area networks (LR-WMPANs) are low cost, low power consumption, and support for simple devices such as sensors and actuators. ZigBee is a standard for wireless personal area networks (WPANs) built on IEEE 802.15.4 and designed specifically for low-rate WPANs. The main research objectives of this paper are to evaluate the adequacy of the IEEE 802.15.4 protocols for supporting WSN applications and to identify open issues in the standard specifications. ZigBee uses Ad-hoc On-demand Distance Vector (AODV) as its routing protocol. Finally, this paper evaluates the performance of Ad-hoc On-demand Multipath Distance Vector (AOMDV) as a routing protocol, comparing experimental and simulation results. A performance comparison of AOMDV with AODV using ns-2 simulations shows that AOMDV is able to cope effectively with mobility-induced route failures.

Approach for Analyzing Clustering Technique in Software Maintenance for Object Oriented System

Vol.5 No.1
Year: 2010
Issue: July - September
Title: Approach for Analyzing Clustering Technique in Software Maintenance for Object Oriented System 
Author Name: Dr. Anil Kumar Malviya, N. Badal   
Synopsis:   
Object-oriented software engineering has emerged as a dominant approach to software development. Although maintenance may turn out to be easier for object-oriented systems, it is unlikely that the maintenance burden will disappear completely; maintenance still consumes a large portion of software cost. It is therefore worthwhile to develop object-oriented systems with maintainability as a key issue in the design phase. This paper examines the role of the clustering technique of data mining in the maintenance of software systems using object-oriented metrics. The presented work evaluates the K-means clustering method by applying it to a commercial software system. The experimental work of software maintenance for the sample data is simulated in Matlab.
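As an illustration of the clustering step only (not the authors' Matlab experiment), here is a minimal K-means sketch in Python over hypothetical per-class object-oriented metrics; the metric names and sample values are assumptions:

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Plain K-means: X is (n_samples, n_features); returns labels and centroids."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Assign each class (row) to its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        new_centroids = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                  else centroids[j] for j in range(k)])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return labels, centroids

# Hypothetical per-class OO metrics: [WMC, CBO, LOC] for six classes.
X = np.array([[12, 4, 300], [15, 5, 350], [3, 1, 80],
              [2, 1, 60], [30, 9, 900], [28, 8, 850]], dtype=float)
labels, _ = kmeans(X, k=3)
print(labels)  # classes grouped into three maintenance-effort clusters
```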


New secure communication protocols for mobile e-health system

Vol.5 No.1
Year: 2010
Issue: July - September
Title: New secure communication protocols for mobile e-health system   
Author Name: Aramudhan M, K. Mohan   
Synopsis:   
E-health systems are communication systems that deliver medical services over the Internet. The medical information accessible in e-health systems is highly sensitive and distributed, which demands strong authentication and authorization mechanisms for communication between healthcare professionals, consumers, and providers. The Internet is an open system in which anybody can access any information, so e-health users demand safe communication and user privacy. This paper introduces two secure communication protocols, based on message passing and on mobile agents, for online e-health systems. A Certificate-based Authentication and Attribute-based Policy-assigned Authorization framework (CAAPA) for mobile e-health systems is proposed based on the message-passing technique, and a Token-based Cross Verification (TCV) protocol is proposed for secure services in e-health systems over the Internet based on mobile agents. Both protocols offer a user-friendly, well-built security mechanism that gives users and healthcare professionals confidence in accessing the e-health system. CAAPA maintains strong user privacy but incurs high communication overhead; TCV maintains strong user privacy while consuming less communication overhead.


Mathematical Modeling and Simulation of Computer Programming

Vol.5 No.1
Year: 2010
Issue: July - September
Title: Mathematical Modeling and Simulation of Computer Programming   
Author Name: S. Sankar, Yuvaraj   
Synopsis:   
This paper examines the application of the Visual Basic programming language to simulate numerical iterations, the merits of Visual Basic as a programming language, and the difficulties faced when solving numerical iterations analytically. The paper encourages the use of computer programming methods for the execution of numerical iterations and, finally, develops a reliable solution using Visual Basic to write a program for selected iteration problems.
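The paper's iteration programs are written in Visual Basic; as a rough illustration of the kind of numerical iteration involved, here is a Newton-Raphson sketch in Python (the choice of f(x) = x^2 - 2 is an assumption for the example):

```python
def newton_raphson(f, df, x0, tol=1e-10, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n)/df(x_n) until the step is below tol."""
    x = x0
    for n in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x, n + 1
    raise RuntimeError("did not converge")

# Example: root of x^2 - 2 (i.e. sqrt(2)), starting from x0 = 1.
root, iterations = newton_raphson(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
print(root, iterations)  # ~1.41421356, reached in a handful of iterations
```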


Random segmentation blocks algorithm for Gout Skin Detection and recognition

Vol.5 No.1
Year: 2010
Issue: July - September
Title: Random segmentation blocks algorithm for Gout Skin Detection and recognition   
Author Name:   
Synopsis:   
Gout is a disease of antiquity, but it is increasing once again in prevalence despite the availability of reasonably effective treatments. This may be related to a combination of factors, including diet, obesity, and diuretic use. Allergic reactions, noncompliance, drug interactions, and sometimes inefficacy all limit the effective use of current hypouricemic agents. The objective of this paper is to show that for every color space there exists an optimal gout skin detection scheme such that the performance of all these schemes is the same, and then to apply the random segmentation blocks algorithm in order to recognize gout skin. A theoretical proof is provided, and experiments show that the separability of the skin and non-skin classes is independent of the color space and the parameters chosen (the experiments used 80 gout images of different types and 80 images of other dermatological disorders); the features tested are energy, entropy, average, and variance. 160 patients were randomly placed in three groups and treated topically over 7 weeks for gout in the foot, the hand, or other parts of the body. The recognition results of the testing program based on the random segmentation blocks algorithm show superior efficacy for gout skin detection (the testing stage contains all 160 images, from which only the gout images are to be recognized).
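For illustration only, the four block features named in the abstract (energy, entropy, average, variance) can be computed as in this Python sketch; the block size, histogram binning, and test image are assumptions, not the paper's setup:

```python
import numpy as np

def block_features(gray, block=16):
    """Energy, entropy, average, and variance for each block of a grayscale image."""
    feats = []
    h, w = gray.shape
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            b = gray[r:r + block, c:c + block].astype(float)
            hist, _ = np.histogram(b, bins=256, range=(0, 256), density=True)
            p = hist[hist > 0]                       # nonzero gray-level probabilities
            feats.append({
                "energy":   float(np.sum(p ** 2)),
                "entropy":  float(-np.sum(p * np.log2(p))),
                "average":  float(b.mean()),
                "variance": float(b.var()),
            })
    return feats

# Example on a random 64x64 "image"; real use would load a skin image.
print(block_features(np.random.randint(0, 256, (64, 64)))[0])
```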


Improving Performance of Voice over IEEE 802.11e

Vol.5 No.1
Year: 2010
Issue: July - September
Title: Improving Performance of Voice over IEEE 802.11e   
Author Name: Chenna Reddy P   
Synopsis:   
Quality of service measures how well the network satisfies user requirements. IEEE 802.11e is the quality-of-service extension of the wireless LAN standard IEEE 802.11. IEEE 802.11e achieves quality of service through service differentiation using the Arbitration Interframe Space and Contention Window parameters, which are set to different initial values for different flows. However, the procedure used to change the contention window is the same for all flows. In this paper the contention window is adjusted differently for different flows to improve the performance of voice. The network simulator NS-2 is used for simulation.
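One plausible reading of the idea, sketched in Python: after a collision, grow the contention window more slowly for voice than for other flows so that voice retains its priority. The per-category parameters and growth factors below are illustrative assumptions, not values taken from the paper or from the standard:

```python
import random

# Illustrative per-access-category parameters (CWmin, CWmax, backoff multiplier).
# Instead of doubling the contention window for every flow after a collision,
# grow it more slowly for voice. All values here are assumptions.
ACS = {
    "voice": {"cw_min": 7,  "cw_max": 15,   "grow": 1.5},
    "video": {"cw_min": 15, "cw_max": 31,   "grow": 2.0},
    "data":  {"cw_min": 31, "cw_max": 1023, "grow": 2.0},
}

def backoff_slots(ac, retries):
    """Draw a backoff slot count after `retries` consecutive collisions."""
    p = ACS[ac]
    cw = min(p["cw_max"], int(p["cw_min"] * p["grow"] ** retries))
    return random.randint(0, cw)

for ac in ACS:
    print(ac, [backoff_slots(ac, r) for r in range(4)])
```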


Effects of Self-frequency Shift on Soliton Propagation

Vol.5 No.1
Year: 2010
Issue: July - September
Title: Effects of Self-frequency Shift on Soliton Propagation   
Author Name: Dowluru Ravi Kumar, B. Prabhakara Rao   
Synopsis:   
Though solitons are more robust than linear transmission systems, the soliton self-frequency shift reduces the stability of a soliton transmission system. In this paper we investigate the propagation of ultra-short solitons in fiber-optic systems in the presence of the soliton self-frequency shift. We also demonstrate that both the self-frequency shift and the background instability can be effectively controlled using spectral filters of moderate strength together with nonlinear gain devices whose gain is proportional to the second and fourth powers of the amplitude.
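For context, the self-frequency shift is usually written as a Raman perturbation on the right-hand side of the nonlinear Schrödinger equation in soliton units; this is the textbook form (sign conventions vary), not an equation reproduced from the paper:

$$ i\,\frac{\partial u}{\partial z} + \frac{1}{2}\,\frac{\partial^2 u}{\partial t^2} + |u|^2 u = \tau_R\, u\, \frac{\partial |u|^2}{\partial t}, $$

where $u$ is the normalized pulse envelope and $\tau_R$ is the Raman response parameter that drives the continuous downshift of the soliton's carrier frequency.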

A new greedy algorithm for multi-processor scheduling with gang scheduling

Vol.5 No.1
Year: 2010
Issue: July - September
Title: A new greedy algorithm for multi-processor scheduling with gang scheduling   
Author Name: Sarath B. Siyambalapitiya, M. Sandirigama   
Synopsis:   
In this study, we propose some greedy algorithms for the multi-processor job scheduling problem. A given list of jobs is arranged according to processing time. Depending on the job processing times, some jobs are divided into multiple threads while others remain single-thread jobs. Multi-thread jobs are processed based on the concept of gang scheduling. A lower bound for the total processing time is computed, and the results of the proposed algorithm are presented as a percentage gap from the optimal solution using this lower bound.
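The gang-scheduling and multi-threading details are specific to the paper, but the greedy core and the lower bound can be sketched as follows; the longest-processing-time-first rule and the bound max(total work / m, longest job) are standard choices, assumed here for illustration:

```python
def greedy_schedule(jobs, m):
    """Longest-processing-time-first greedy: assign each job, longest first,
    to the currently least-loaded of m processors."""
    loads = [0.0] * m
    for p in sorted(jobs, reverse=True):
        loads[loads.index(min(loads))] += p
    return loads, max(loads)          # per-processor loads and the makespan

def lower_bound(jobs, m):
    """Simple makespan lower bound: max(total work / m, longest single job)."""
    return max(sum(jobs) / m, max(jobs))

jobs = [7, 5, 4, 3, 3, 2, 2, 1]
loads, makespan = greedy_schedule(jobs, m=3)
lb = lower_bound(jobs, m=3)
print(loads, makespan, f"gap = {100 * (makespan - lb) / lb:.1f}%")
```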

Adaptive Resource Management in Application to Surveillance Networks using Stochastic Methodologies

Vol.5 No.1
Year: 2010
Issue: July - September
Title: Adaptive Resource Management in Application to Surveillance Networks using Stochastic Methodologies   
Author Name: Charles C. Castello, Jeffrey Fan   
Synopsis:   
A wide range of applications have been developed in recent years pertaining to surveillance networks, which include defense, environmental protection, manufacturing, weather forecasting, and structural monitoring. The following research aims to present a novel method of resource allocation in surveillance networks (e.g. Wireless Sensor Networks) by using stochastic modeling techniques and reconfigurable System-on-a-Chip (SoC) systems. The basic idea behind the proposed framework is that a set amount of system resources (e.g. processing power, transmission bandwidth, system memory, etc.) can be dynamically allocated to different nodes within the system depending on the application or needs at any given time. The allocation of these resources is based on a stochastic approach which models resource demands by utilizing known and unknown random distributions. These distributions are analyzed using their associated polynomial expansions for known distributions and importance sampling for unknown distributions. An example of this would be using the Hermite Polynomial Chaos (PC) representation of random processes for Gaussian and log-normal distributions. The proposed framework results in intelligent surveillance networks with the ability to allocate resources in real-time.
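For reference, the Hermite Polynomial Chaos representation mentioned above expands a random quantity in Hermite polynomials of a standard Gaussian germ; this is the textbook form, with coefficients obtained by Galerkin projection:

$$ X(\theta) \approx \sum_{i=0}^{P} a_i\, H_i(\xi(\theta)), \qquad \xi \sim \mathcal{N}(0,1), \qquad a_i = \frac{\langle X, H_i \rangle}{\langle H_i^2 \rangle}, $$

where the $H_i$ are Hermite polynomials orthogonal with respect to the Gaussian measure; a log-normal demand is handled by expanding $e^{\mu + \sigma\xi}$ in the same basis.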

Thursday 28 March 2013

A Quantization based Watermarking Scheme for Image Authentication

Vol.4 No.4
Year: 2010
Issue: April-June
Title: A Quantization based Watermarking Scheme for Image Authentication   
Author Name: Manisha Sharma, M. Kowar   
Synopsis:   
A watermarking technique for image authentication inserts hidden data into an image in order to detect any accidental or malicious alteration. In the proposed work, a watermark in the form of a visually meaningful binary pattern is embedded for the purpose of tamper detection. The watermark is embedded in the discrete cosine transform coefficients of the image by quantizing the coefficients proportionally according to the watermark bit. Experimental results demonstrate the performance and effectiveness of the scheme for image authentication, maintaining good values for both the Peak Signal to Noise Ratio (PSNR) and the Tamper Assessment Function (TAF).
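The abstract does not give the exact quantization rule, so here is a hedged Python sketch of one standard choice, quantization index modulation (QIM), applied to individual DCT coefficients; the step size DELTA and the sample coefficients are assumptions:

```python
import numpy as np

DELTA = 8.0  # quantization step: larger = more robust embedding, lower PSNR

def embed_bit(coeff, bit):
    """Quantization index modulation: snap the coefficient onto one of two
    interleaved lattices, one lattice per bit value."""
    return DELTA * np.round((coeff - bit * DELTA / 2) / DELTA) + bit * DELTA / 2

def extract_bit(coeff):
    """Recover the bit from which lattice the coefficient lies on."""
    return int(np.round(coeff / (DELTA / 2))) % 2

coeffs = np.array([13.7, -5.2, 41.0, 2.9])   # stand-ins for DCT coefficients
bits = [1, 0, 1, 1]
marked = [embed_bit(c, b) for c, b in zip(coeffs, bits)]
print([extract_bit(c) for c in marked])       # -> [1, 0, 1, 1]
```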

Improvement in Spectral Output and Computational Efficiency of Digital Filter Bank

Vol.4 No.4
Year: 2010
Issue: April-June
Title: Improvement in Spectral Output and Computational Efficiency of Digital Filter Bank   
Author Name: Ganekanti Hemanj, K. Satya Prasad , P. Venkata Subbaiah  
Synopsis:   
Digital filtering is a crucial operation in the reconstruction and visualization of information, in addition to improving computational efficiency. An FIR-based digital filter bank is especially effective in these respects. In this paper, we propose a novel multirate approach using a digital filter bank based on a modified Kaiser window. A remarkable spectral output is achieved through increased magnitude, better output quality, improved frequency response, and higher computational efficiency. Simulation results demonstrating satisfactory performance are included, and a comparison is drawn to highlight the advantages of the proposed method. This type of filter bank is particularly suitable for hearing aid applications, where it achieves significant improvements in output quality.
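The paper's modified Kaiser window is not reproduced here, but a uniform FIR filter bank built with SciPy's standard Kaiser window conveys the structure; the band count, tap count, and beta below are illustrative assumptions:

```python
import numpy as np
from scipy.signal import firwin, freqz

def kaiser_filter_bank(num_bands, numtaps=129, beta=8.0, fs=2.0):
    """Split [0, fs/2] into uniform bands, one Kaiser-window FIR filter each."""
    edges = np.linspace(0, fs / 2, num_bands + 1)
    bank = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        if lo == 0:                       # lowpass for the first band
            taps = firwin(numtaps, hi, window=("kaiser", beta), fs=fs)
        elif hi >= fs / 2:                # highpass for the last band
            taps = firwin(numtaps, lo, window=("kaiser", beta),
                          pass_zero=False, fs=fs)
        else:                             # bandpass in between
            taps = firwin(numtaps, [lo, hi], window=("kaiser", beta),
                          pass_zero=False, fs=fs)
        bank.append(taps)
    return bank

bank = kaiser_filter_bank(num_bands=4)
w, h = freqz(bank[1])                     # inspect the second band's response
print(len(bank), abs(h).max())
```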

To Cope With Misbehavior in Mobile

Vol.4 No.4
Year: 2010
Issue: April-June
Title: To Cope With Misbehavior in Mobile   
Author Name: V. Sumalatha, Prasanthi   
Synopsis:   
Ad-hoc wireless networks have emerged as one of the key growth areas for wireless networking and computing technology. One of the major factors affecting ad-hoc communication is the misbehavior of nodes. Node misbehavior due to selfishness, maliciousness, or faults can significantly degrade the performance of mobile ad hoc networks. Most routing protocols for wireless ad hoc networks, such as DSR, fail to detect misbehavior and assume that nodes are trustworthy and cooperative. Here the Dynamic Source Routing (DSR) protocol is modified to cope with misbehavior. It enables nodes to detect misbehavior by first-hand observation and to use second-hand information provided by other nodes. The view a node has about the behavior of another node is captured in a reputation system, which is used to classify nodes as misbehaving or normal; once a misbehaving node is detected, it is isolated from the network. Reputation systems can, however, be tricked by the spread of false reputation ratings, be it false accusations or false praise. To solve this problem, a fully distributed reputation system is proposed that can cope with false information and effectively use second-hand information in a safe way. The approach is based on a modified Bayesian estimation and classification procedure for isolating the malicious and selfish nodes of a given network. In this paper, tests are performed on a network containing normal and misbehaving nodes, the delay plots of the original DSR and the modified DSR are compared, and the performance is analyzed. The proposed scheme is implemented in MATLAB for protocol implementation and verification.
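A minimal sketch of the Bayesian reputation idea, assuming a Beta-distribution record per neighbor and a discount on second-hand reports; the paper's exact update rules, weights, and thresholds are not reproduced:

```python
class Reputation:
    """Beta-distribution reputation record for one neighbor node."""
    def __init__(self):
        self.alpha = 1.0  # prior pseudo-count of good behavior
        self.beta = 1.0   # prior pseudo-count of misbehavior

    def observe(self, misbehaved, weight=1.0):
        """First-hand observations use weight 1.0; second-hand reports can be
        discounted (e.g. weight 0.2) so false ratings do less damage."""
        if misbehaved:
            self.beta += weight
        else:
            self.alpha += weight

    def misbehavior_belief(self):
        return self.beta / (self.alpha + self.beta)  # posterior mean

    def is_misbehaving(self, threshold=0.75):        # threshold is an assumption
        return self.misbehavior_belief() > threshold

r = Reputation()
for _ in range(9):
    r.observe(misbehaved=True)                       # nine dropped packets
print(r.misbehavior_belief(), r.is_misbehaving())    # ~0.91 -> isolate node
```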


Performance Analysis of Hamming Code for Fault Tolerant 8-bit data bus in VDSM technology

Vol.4 No.4
Year: 2010
Issue: April-June
Title: Performance Analysis of Hamming Code for Fault Tolerant 8-bit data bus in VDSM technology   
Author Name: Sathish A, M. Chennakesavulu , M. Madhavi Latha , K. Lal Kishore   
Synopsis:   
In Very Deep-Submicron (VDSM) systems, the scaling of ULSI ICs has increased the sensitivity of CMOS technology to various noise mechanisms such as power supply noise, crosstalk noise, and leakage noise. In VDSM technology the distance between data bus lines is reduced, so coupling capacitance becomes the dominating factor. The coupling capacitance (CC) exists between long parallel wires, while the load capacitance (CL) is the wire-to-substrate capacitance. Unfortunately, in VDSM systems the coupling capacitance is several times larger than the loading capacitance, and it causes logical malfunction, delay faults, and extra power consumption on long on-chip data buses. An important effect of the coupling capacitance is crosstalk, which depends mainly on several factors: drive strength, wire length and spacing, edge rate, and propagation duration. Such faults may corrupt data on the bus, with a severity that depends on the fault duration. To avoid this condition and to guarantee signal integrity in on-chip communication, a fault-tolerant bus can be adopted. This can be achieved by implementing error-correcting codes (ECCs), which provide on-line correction and do not require data retransmission. The 8-bit data bus is implemented in 1200nm, 180nm, 120nm, 90nm, and 65nm technologies, and simulation results show that crosstalk increases as the technology scales down. For reliable transmission of the data, an ECC technique is placed on the data bus: we employ a Hamming code as the ECC for an 8-bit fault-tolerant data bus, again implemented in 1200nm, 180nm, 120nm, 90nm, and 65nm technologies. The simulation results show that the average power varies from 48.054 mW to 0.235 mW, and the maximum delay varies from 3.437 ns to 0.092 ns, respectively.
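A bit-level Python sketch of a single-error-correcting Hamming code for an 8-bit word (4 parity bits, 12-bit codeword) illustrates the ECC principle; the paper's circuit-level implementation is of course not shown:

```python
# Parity bits sit at positions 1, 2, 4, 8 of the 12-bit codeword; each covers
# every position whose index has that bit set.
DATA_POS = [3, 5, 6, 7, 9, 10, 11, 12]

def encode(data8):
    code = [0] * 13                       # index 0 unused, positions 1..12
    for bit, pos in zip(data8, DATA_POS):
        code[pos] = bit
    for p in (1, 2, 4, 8):
        code[p] = 0
        for i in range(1, 13):
            if i & p and i != p:
                code[p] ^= code[i]
    return code[1:]

def decode(code12):
    code = [0] + list(code12)
    syndrome = 0
    for p in (1, 2, 4, 8):
        parity = 0
        for i in range(1, 13):
            if i & p:
                parity ^= code[i]
        if parity:
            syndrome |= p
    if syndrome:                          # syndrome names the flipped position
        code[syndrome] ^= 1
    return [code[i] for i in DATA_POS]

word = [1, 0, 1, 1, 0, 0, 1, 0]
sent = encode(word)
sent[5] ^= 1                              # inject a single-bit bus fault
assert decode(sent) == word               # corrected on the fly, no retransmit
```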

Efficient Greedy Algorithm for Multi-Processor Scheduling

Vol.4 No.4
Year: 2010
Issue: April-June
Title: Efficient Greedy Algorithm for Multi-Processor Scheduling   
Author Name: Ruwanthini Siyambalapitiya, M. Sandirigama   
Synopsis:   
In this study, we propose some simple greedy algorithms for the multi-processor job scheduling problem. A given list of jobs is arranged according to processing time. The results of the proposed algorithms are compared with the first-come first-served (FCFS) job scheduling approach and shown to be superior.

Defending Against Stealthy Botnets

Vol.4 No.4
Year: 2010
Issue: April-June
Title: Defending Against Stealthy Botnets   
Author Name: Bagath Basha, N. Sankar Ram , Paul Rodrigues , Ranjith   
Synopsis:   
Global Internet threats are rapidly evolving from attacks designed solely to disable infrastructure to those that also target people and organizations. This alarming new class of attacks directly impacts the day-to-day lives of millions of people and endangers businesses and governments around the world. For example, computer users are assailed with spyware that snoops on confidential information, spam that floods email accounts, and phishing scams that steal identities. At the center of many of these attacks is a large pool of compromised computers located in homes, schools, businesses, and governments around the world. In this paper, the authors provide a detailed overview of current botnet technology and defense by exploring the intersection between existing botnet research, the evolution of botnets themselves, and the goals and perspectives of various types of networks. The authors also describe the invariant nature of botnet behavior in its various phases, and how different kinds of networks have access to different types of visibility, which has a strong impact on the effectiveness of any botnet detection mechanism. A comprehensive picture of the various botnet detection techniques that have been proposed is given. Finally, the paper summarizes the survey and suggests future directions.

Predictive Analysis in VLSI Design

Vol.4 No.4
Year: 2010
Issue: April-June
Title: Predictive Analysis in VLSI Design   
Author Name: Tom Page, Gisli Thorsteinsson   
Synopsis:   
Advances in silicon technology have made possible the design of very large scale integration (VLSI) chips. The size of these designs has reached a level where the design activity is carried out by a team of designers rather than an individual. The complexity of the designs and the merging of functional islands from different designers can lead to mistakes (bugs) in the chip. This paper details the methods used by the Subsystems Electronics hardware design group (based at IBM Havant) to minimise the possibility of releasing a chip containing bugs. In most cases the design will have cost and schedule constraints, and there is a trade-off between the amount of time and effort expended in the design and simulation phases and the risk of sending a chip for fabrication before all the bugs have been found. The problem of determining when a chip should be released for fabrication has been addressed by the use of statistical analysis to assess when the simulation is complete or no longer likely to find mistakes.


Monte Carlo Simulation for Reliability Assessment of Component Based Software Systems

Vol.4 No.4
Year: 2010
Issue: April-June
Title: Monte Carlo Simulation for Reliability Assessment of Component Based Software Systems   
Author Name: Chinnaiyan R, S. Somasundaram                 
Synopsis:   
Reliability assessment of component-based software systems plays a vital role in developing quality software using the component-based development methodology. It can be performed with the Monte Carlo simulation method of reliability prediction when software system complexity makes the formulation of exact models essentially impossible. The characteristics of the Monte Carlo method make it ideal for estimating the reliability of component-based software systems: unlike many other mathematical models, software system complexity is irrelevant to the method. Not only can the structure of the software system be dynamic, but the precise structure of the software system need not even be known. Instead, software components need only be tested for failure during operation, which ensures that components used more often contribute proportionally more to the overall system reliability estimate. This paper presents a novel Monte Carlo simulation method for assessing the reliability of component-based software systems.
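A minimal Monte Carlo sketch in Python of the idea that frequently exercised components dominate the estimate; the component names, reliabilities, and usage profile are illustrative assumptions:

```python
import random

# Hypothetical per-operation component reliabilities and a usage profile:
# each run executes a randomly chosen path (sequence of components).
RELIABILITY = {"ui": 0.999, "auth": 0.995, "db": 0.99, "report": 0.98}
PATHS = [(["ui", "auth", "db"], 0.7),              # (component path, probability)
         (["ui", "auth", "db", "report"], 0.3)]

def run_once(rng):
    path = rng.choices([p for p, _ in PATHS], weights=[w for _, w in PATHS])[0]
    return all(rng.random() < RELIABILITY[c] for c in path)  # all components survive

def estimate_reliability(n=100_000, seed=1):
    rng = random.Random(seed)
    return sum(run_once(rng) for _ in range(n)) / n

print(estimate_reliability())  # heavily used components weigh in more often
```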

H.264 Based Architecture of Digital Surveillance Network in Application to Computer Visualization

Vol.4 No.4
Year: 2010
Issue: April-June
Title: H.264 Based Architecture of Digital Surveillance Network in Application to Computer Visualization   
Author Name: Wei Zhao, Raul Batista , Jeffrey Fan, Jichang Tan   
Synopsis:   
The majority of today’s video surveillance systems are analog based, require human interaction, and are susceptible to multiple threats. By applying a fully digitized system incorporating the Vector Bank (VB), Laplacian of Gaussian (LoG), and Directional Discrete Cosine Transform (DDCT) to each camera in a surveillance network, digital processing is accomplished at the individual camera level, reducing the memory required at the network hub and providing a decentralized, secure surveillance network. Detection and tracking of moving objects may be sensed by each independent surveillance camera in the network and processed using VB, LoG, and DDCT to drastically reduce bandwidth and processing time, ultimately reducing data bus traffic and transmission bandwidth. In this paper, the addition of hardware-based System-on-Chip (SoC) digital signal processing algorithms interfaced to an H.264 architecture, for the purpose of a fully redundant and autonomous multi-camera digital video surveillance network, is discussed. Experimental results show the system's capability to detect, isolate, and track objects while reducing the computational time, channel bandwidth, and overhead costs of the digital video surveillance network and increasing security. These achievements greatly improve computer visualization and the performance of the network.
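As a small illustration of the LoG stage only (the VB and DDCT stages are specific to the paper), SciPy's gaussian_laplace produces the blob/edge response on a frame; the synthetic frame, sigma, and threshold below are assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

# Laplacian-of-Gaussian (LoG): smooth with a Gaussian of the given sigma, then
# apply the Laplacian; strong responses mark edges and compact objects.
frame = np.zeros((64, 64))
frame[24:40, 24:40] = 255.0                       # synthetic "object" in the frame
log_response = gaussian_laplace(frame, sigma=2.0)
object_mask = np.abs(log_response) > 0.1 * np.abs(log_response).max()
print(object_mask.sum(), "pixels flagged")        # candidate pixels to track
```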


Broad and Robotic Simulation Modeling Based On Measurements

Vol.4 No.4
Year: 2010
Issue: April-June
Title: Broad and Robotic Simulation Modeling Based On Measurements   
Author Name: S. Sankar, G. Gokula Krishnan   
Synopsis:   
SoFT is a new method and tool that measures and models linear electrical network components over a wide frequency band with unprecedented accuracy. This is achieved by a special modal-based measurement technique in combination with suitable rational fitting and passivity enforcement methods. The models are easily imported into the most commonly used simulation software. This paper demonstrates the SoFT tool computations in a comparison between (A) time-domain measurements of a lightning impulse test of a power transformer, (B) simulation of the test results using a SoFT model, and (C) simulation of the test results using a lumped-element circuit simulation model based on geometrical transformer design information.


A Multi-Agent Hierarchical Fuzzy Signatures Approach To Optimize A Quality Management System

Vol.4 No.4
Year: 2010
Issue: April-June
Title: A Multi-Agent Hierarchical Fuzzy Signatures Approach To Optimize A Quality Management System   
Author Name: Hajer Ben Mahmoud, Raouf Ketata, Taieb Ben Romdhane, Samir Ben Ahmed   
Synopsis:   
This paper proposes a support system for the quality management of a company. The reproduction of such a Quality Management System (QMS) requires both modeling and piloting. For modeling, a Multi-Agent System (MAS) approach is proposed, using a micro-framework of interactions between the agents based on UML (Unified Modeling Language) sequence diagrams. For piloting, the proposed method is Hierarchical Fuzzy Signatures (HFS), one of the best methods for this kind of problem. To validate the developed system, an industrial company with a major problem in controlling the quality level of its production lines was chosen. The results show that the HFS concept is effective, efficient, and flexible for piloting the QMS-MAS model.

Wednesday 27 March 2013

Color Image Restoration for an Effective Steganography

Vol.4 No.3
Year: 2010
Issue: January - March
Title: Color Image Restoration for an Effective Steganography                            
Author Name: Dwarkoba Gaikwad, S.J. Wagh   
Synopsis:   
The word steganography comes from the Greek steganos, meaning covered or secret, and graphy, meaning writing or drawing. There are many techniques in steganography, such as least significant bit (LSB) insertion, masking and filtering, and transformation techniques. The LSB technique is common; it randomly selects pixels of the cover object in which to hide the secret message. It is possible to combine techniques by encrypting a message using cryptography and then hiding the encrypted message using steganography. The resulting stego-image can be transmitted without revealing that secret information is being exchanged; furthermore, even if an attacker were to defeat the steganographic technique and detect the message in the stego-object, he would still require the cryptographic decoding key to decipher the encrypted message. In this paper we propose an image restoration technique for steganography: we blur the image before hiding the message image, using a special Point Spread Function (PSF) and a randomly generated key. We use two input key values as parameters of the PSF, and at the time of blurring, a new third key value is generated for each message image. These keys, which are kept secret, are used when deblurring the message image. To recover the real message image, we need to extract the image from the stego image and then restore it; it is very difficult to restore the message image without knowing the PSF and the third key value. We found this technique to be fast, simple, and robust.
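A minimal sketch of the key-driven LSB step described above, in Python; using a PRNG seed as the pixel-selection key is an assumption standing in for the paper's key scheme, and the blur/PSF stage is not reproduced:

```python
import numpy as np

def lsb_embed(cover, message_bits, seed=42):
    """Hide bits in the least significant bits of randomly selected pixels.
    The seed plays the role of the shared key selecting pixel positions."""
    stego = cover.copy().ravel()
    rng = np.random.default_rng(seed)
    positions = rng.choice(stego.size, size=len(message_bits), replace=False)
    bits = np.asarray(message_bits, dtype=stego.dtype)
    stego[positions] = (stego[positions] & 0xFE) | bits
    return stego.reshape(cover.shape), positions

def lsb_extract(stego, n_bits, seed=42):
    rng = np.random.default_rng(seed)            # same key -> same positions
    positions = rng.choice(stego.size, size=n_bits, replace=False)
    return [int(b) for b in stego.ravel()[positions] & 1]

cover = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
bits = [1, 0, 1, 1, 0]
stego, _ = lsb_embed(cover, bits)
assert lsb_extract(stego, len(bits)) == bits
```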

A Seamless Handover Scheme for Wireless Networks Using SIGMA

Vol.4 No.3
Year: 2010
Issue: January - March
Title: A Seamless Handover Scheme for Wireless Networks Using SIGMA   
Author Name: B. Jaiganesh, R. Ramachandran   
Synopsis:   
Wireless networks have become more widely used to support advanced services. Researchers have been interested in having Internet connectivity in space for quite some time, as it would provide scientists with direct Internet access to data and devices on satellites. The rotation of Low Earth Orbiting (LEO) satellites around the Earth results in handovers of satellites between ground stations. Two types of handover can be observed in space: link layer and network layer. Researchers have been developing a Seamless IP diversity-based Generalized Mobility Architecture (SIGMA) to ensure smooth handovers of end-to-end connections between nodes on Earth and satellites. In this paper, we provide a survey of the various types of handovers in the space environment, followed by simulation results of SIGMA handover performance in a space environment.

On the Signature Analysis of Analog-to-Digital Converters

Vol.4 No.3
Year: 2010
Issue: January - March
Title: On the Signature Analysis of Analog-to-Digital Converters   
Author Name: Vadim Geurkov, L. Kirischian   
Synopsis: 
Analog-to-digital converters (ADCs) have been an essential part of many systems employed in mission-critical applications, and their fault tolerance has become an increasingly important issue. The notion of fault tolerance includes fault detection (or testing). Automatic test equipment (ATE) has been used extensively to perform sophisticated testing of ADCs. However, ATE cannot be utilized in the field due to its extremely high cost. In addition, the bandwidth of the ATE is normally lower than the bandwidth of the ADC being tested, which makes it difficult to satisfy at-speed testing requirements. It is important, therefore, to embed test hardware into the ADC itself. The methods employed in ATE are complex and inconvenient for built-in realization. More advantageous are methods that exploit accumulation of output responses. The size of the accumulator depends on the number of responses; to achieve greater fault coverage, this number is kept large, complicating the implementation. On the other hand, signature analysis, as used in digital systems testing, is well suited for compaction of “lengthy” responses, and it is characterized by small hardware overhead and low aliasing probability. In this work, we apply the signature analysis principle to the compaction of output responses of an ADC. The permissible tolerance bounds for a fault-free ADC are determined, and the aliasing rate is estimated. Examples are given.

Association Rule Mining with Neural Network for Brain Tumor Classification

Vol.4 No.3
Year: 2010
Issue: January - March
Title: Association Rule Mining with Neural Network for Brain Tumor Classification   
Author Name: P. Rajendran, M. Madheswaran   
Synopsis:   
In the recent past, computer-aided diagnosis systems have been developed to assist physicians in better decision making, which has motivated research into creating vast image databases in hospitals and health care centers. It has been reported that brain tumors are one of the major causes of death in humans. Physicians face a challenging task in extracting features and making decisions from computed tomography (CT), which is found to be the most reliable method for early detection of tumors. Due to the high volume of CT images to be read by physicians, the accuracy of decision making tends to decrease, which has further increased the demand for automatic digital reading to support decision making. This paper proposes a method for tumor detection in brain images. The authors investigate the use of different data mining techniques, namely neural networks and association rule mining, for anomaly detection and classification. The proposed method uses association rule mining to classify CT brain images into three categories: normal, benign, and malignant. It combines low-level features extracted from images with high-level knowledge from specialists. The developed algorithm can assist physicians in efficient classification with multiple keywords per image to improve accuracy. The results show a classification accuracy of 75% for the association rule classifier and 70% for the neural network classifier, making the scheme suitable for image mining applications.

Analysis of Inductive Effects of On-chip and On-board Return Current Path for Performance Degradation Modeling in VLSI design

Vol.4 No.3
Year: 2010
Issue: January - March
Title: Analysis of Inductive Effects of On-chip and On-board Return Current Path for Performance Degradation Modeling in VLSI design   
Author Name: Sourabh Sthapak, Jeffrey Fan   
Synopsis:   
As the dimensions of interconnects in integrated circuits have become dominant, the inductive effects of the wires can no longer be ignored. At high frequency, the return current distributes itself close to the signal path, and any increase in the inductance of the return path hampers signal integrity. The multi-layered power distribution network (PDN) is stressed when many devices draw current simultaneously, creating noise in the supply rails. This high-speed current not only causes ground bounce and power supply sag but also needs a low-inductance return path. Since the high frequencies involved in contemporary signaling make the interconnects behave as lossy transmission lines, the chip may sustain less noise margin under environmental or process variations. If the inductive effect is considered, the design will be more robust and variability-free, thus improving defect tolerance. In this paper, a SPICE-based analysis of the on-board high-speed return current path is conducted, and the techniques and results are then extended to on-chip return current path analysis. In addition to avoiding operational failures, a priori knowledge of a signal and its respective return path greatly helps to simplify interconnect design and routing.

Local feature descriptive modeling for natural images for image retrieval system

Vol.4 No.3
Year: 2010
Issue: January - March
Title: Local feature descriptive modeling for natural images for image retrieval system   
Author Name: P.V.N. Reddy, K. Satya Prasad   
Synopsis:   
In this paper a retrieval study for a natural-image learning environment is proposed. The content of this paper is the result of a project on image content-based retrieval over a natural image database. The objective of this project is to develop an image content-based search engine that can perform an identity check on a natural image. It is well known that conventional natural image databases can only be retrieved by text-based query. In this paper we use the shape, color, and other features extracted from a captured natural image to search the natural image database. The developed technique is able to perform scale-, translation-, and rotation-invariant matching between natural images. Currently, the database contains several hundred natural images. In future, we shall enhance the capability of the search engine to deal with more than 30,000 natural image species, which is the total number of species along the coast.

Discrete Wavelet Transform with Enhancement Filter for ECG Signal

Vol.4 No.3
Year: 2010
Issue: January - March
Title: Discrete Wavelet Transform with Enhancement Filter for ECG Signal   
Author Name: Khaled Daqrouq, Abd Alrahman Qawasmi , Milhled Alfaouri  
Synopsis:   
The ECG is a very important tool for the primary diagnosis of heart disease; it gives a full picture of the electrophysiology of the heart and of the ischemic changes that may occur, such as myocardial infarction, conduction defects, and arrhythmia. For this reason the ECG signal must be clearly represented and filtered to remove all noise and artifacts. The ECG is a biosignal that is considered non-stationary, and denoising it takes hard work. In this paper a new approach to denoising the ECG signal using the Wavelet Transform (WT) is proposed. Different ECG signals are used to evaluate the method using MATLAB software. The presented method shows better results than conventional methods, particularly in the ECG signal case.
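A hedged Python sketch of standard wavelet-shrinkage denoising with PyWavelets; the wavelet, decomposition level, and universal threshold are common defaults assumed here, not necessarily the paper's exact method or enhancement filter:

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Soft-threshold the detail coefficients using the universal threshold
    estimated from the finest-scale noise (Donoho's rule)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # robust noise estimate
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(signal)]

# Synthetic stand-in for an ECG: a clean periodic waveform plus Gaussian noise.
t = np.linspace(0, 1, 1024)
clean = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)
noisy = clean + 0.3 * np.random.default_rng(0).normal(size=t.size)
print(np.std(noisy - clean), np.std(wavelet_denoise(noisy) - clean))
```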



Fault Tolerance and Reliability of Mesh Multi-computer Networks – A Review

Vol.4 No.3
Year: 2010
Issue: January - March
Title: Fault Tolerance and Reliability of Mesh Multi-computer Networks – A Review   
Author Name: Mostafa Abd-el-barr   
Synopsis:   
Multi-computer systems (MCSs) are efficient in solving computing-intensive problems. An MCS consists of a number of processing elements (PEs) and an interconnection network (IN). A fault in any PE and/or the IN can lead to losses in sensitive data and/or overall throughput. In this paper, we provide a review of fault tolerance and reliability assessment techniques for mesh MCSs. A number of fault-tolerant routing techniques are analyzed, including dimension ordering, the turn model, and the block-fault model. In the reliability analysis, we consider exact and approximate sub-mesh reliability models and task-based reliability computation. It is expected that some of the techniques and algorithms covered in this paper will have applications in the domain of wireless mesh networks.

An Intelligent Service System Based On Soundex and Multi-Layer Neural Network Algorithms

Vol.4 No.3
Year: 2010
Issue: January - March
Title: An Intelligent Service System Based On Soundex and Multi-Layer Neural Network Algorithms   
Author Name: Lubna Badri, Bilal Deya , Rami Zuhdi   
Synopsis:   
The aim of this research is to develop an intelligent service system to receive, manage, and serve text messages sent by system users asking for files stored in a system database. The proposed system needs to understand the received text message and decide which files are required. The system comprises a computer program and computer-readable storage for handling files, which can also be used for handling e-mail attachments, and it sends the requested files to the user's email. To improve the text message recognition system, a combination of two methods is used: a supervised multilayer neural network trained with backpropagation, and the Soundex algorithm. Experimental results show that combining the two classification techniques to recognize and manipulate the received text message improves the classification accuracy.
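The Soundex half of the combination is a fixed, well-known algorithm; a compact Python version shows why it helps with misspelled words in text messages (the neural network half is not sketched here):

```python
# Classic Soundex: keep the first letter, map remaining letters to digit codes,
# collapse adjacent duplicate codes, drop vowels, pad/truncate to 4 characters.
CODES = {**dict.fromkeys("BFPV", "1"), **dict.fromkeys("CGJKQSXZ", "2"),
         **dict.fromkeys("DT", "3"), "L": "4",
         **dict.fromkeys("MN", "5"), "R": "6"}

def soundex(word):
    word = "".join(ch for ch in word.upper() if ch.isalpha())
    if not word:
        return ""
    out, prev = word[0], CODES.get(word[0], "")
    for ch in word[1:]:
        code = CODES.get(ch, "")
        if code and code != prev:
            out += code
        if ch not in "HW":        # H and W do not break runs of equal codes
            prev = code
    return (out + "000")[:4]

print(soundex("Robert"), soundex("Rupert"))          # both R163 -> same key
print(soundex("attachment"), soundex("atachment"))   # typo maps to same key
```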

Tuesday 26 March 2013

Character Analysis using Matra Segmentation Algorithms for Distorted Tamil Characters

Vol.4 No.2
Year: 2009
Issue: October - December
Title: Character Analysis using Matra Segmentation Algorithms for Distorted Tamil Characters   
Author Name: R. Indra Gandhi, K. Iyakutti   
Synopsis:   
Segmentation is an important phase in designing an optical character recognition system. Most segmentation algorithms primarily aim at segmenting text, graphics, pages, lines, and words. It is a critical step, as most recognition errors occur due to incorrect segmentation of characters. Character segmentation is the fundamental process in character recognition approaches that rely on isolated characters, and the accuracy of a text recognition system depends heavily on it. Existing techniques do not work well when the document contains distorted characters; special care with the “matra” is needed to segment them. In this paper, we have empirically implemented algorithms that solve the key problems of distorted character segmentation. Experimental results show that the proposed technique is accurate, easy to extend, and can be very effective for non-headline-based complex Indic scripts.



A Technique to Reduce Data-Bus Coupling Transitions in DSM Technology

Vol.4 No.2
Year: 2009
Issue: October - December
Title: A Technique to Reduce Data-Bus Coupling Transitions in DSM Technology   
Author Name: Sathish A, M. Madhavi Latha , K. Lal Kishore , M.V. Subramanyam , C.S. Reddy   
Synopsis:   
With growing integration density and shrinking feature size in deep sub-micrometer (DSM) technologies, on-chip buses play an important role in the overall performance of a system. With large buses and deep sub-micron effects, where the coupling capacitance between bus lines is of the same order of magnitude as the base capacitance, the power consumption of interconnects starts to have a significant impact on a system's total power consumption. In many digital processors, the power dissipated on the buses is a major part of the total chip power dissipation. In CMOS circuits most power is dissipated as dynamic power for charging and discharging node capacitances, and coupling transitions contribute a significant share of the energy loss in deep sub-micron data buses. Earlier schemes based on switching activity, which account only for the substrate capacitance, are not valid for these buses. Hence a new low-coupling-transition bus encoding scheme is proposed, which reduces the power consumption of on-chip data buses by reducing coupling transitions. The proposed technique reduces coupling transitions by 41% to 44%, and its efficiency is 1% to 18% higher compared with other encoding techniques.
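The paper's encoding scheme is not reproduced here; as a stand-in, this Python sketch counts worst-case coupling transitions (adjacent lines switching in opposite directions) and applies classic bus-invert coding, one well-known way to cut such transitions:

```python
def opposite_transitions(prev, curr):
    """Count adjacent bus lines switching in opposite directions
    (one 0->1 while its neighbor goes 1->0): the worst coupling case."""
    return sum(1 for i in range(len(curr) - 1)
               if (curr[i] - prev[i]) * (curr[i + 1] - prev[i + 1]) == -1)

def bus_invert(prev, curr):
    """Classic bus-invert coding: transmit the inverted word (plus an invert
    flag line) whenever that toggles fewer than half the lines."""
    if sum(p != c for p, c in zip(prev, curr)) > len(curr) // 2:
        return [1 - b for b in curr], 1
    return curr, 0

prev = [0, 1, 0, 1, 0, 1, 0, 1]
curr = [1, 0, 1, 0, 1, 0, 1, 0]
encoded, inv = bus_invert(prev, curr)
print(opposite_transitions(prev, curr), opposite_transitions(prev, encoded), inv)
# -> 7 opposite transitions raw, 0 after inversion
```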


A Case Study Analysis of Software Architecture Using Innovative Patterns

Vol.4 No.2
Year: 2009
Issue: October - December
Title: A Case Study Analysis of Software Architecture Using Innovative Patterns   
Author Name: N. Sankar Ram, Paul Rodrigues , Omar A. Alheyasat , Subramanyam Arige   
Synopsis:   
In today’s world of rapidly advancing technology, guaranteeing software quality is of paramount importance, and the quality of software is intricately connected to its underlying architecture. It is recognized that it is not possible to measure the quality attributes of the final system based on the software architecture design alone, because the software architecture of a system is defined as the “meta-structure, which comprises software components, the externally visible properties of those components, and the relationships among them”; this definition focuses only on the internal aspects of the system. In the current work, analysis models such as SAAM (Software Architecture Analysis Method), SAAMCS (SAAM Founded on Complex Scenarios), ESAAMI (Extending SAAM by Integration in the Domain), SAAMER (Software Architecture Analysis Method for Evolution and Reusability), ATAM (Architecture Trade-off Analysis Method), SBAR (Scenario-Based Architecture Reengineering), ALPSM (Architecture Level Prediction of Software Maintenance), and SAEM (Software Architecture Evaluation Model) were considered. The Architecture Trade-off Analysis Method (ATAM) was found to be the best among them for evaluating software quality attributes, but it suffers from inherent drawbacks in terms of cost, quality attributes, and business focus. A software architecture analysis model is presented that overcomes the drawbacks of ATAM by using various innovative patterns: the Subtraction, Multiplication, Division, Task Unification, and Attribute Dependency Change patterns. These innovative patterns were applied to a case study, and the results are discussed.