Anyone can try this: send them any sort of stupid paper, and the journal will accept it and ask you for MONEY.
Papers published by this journal have no value. The journal is already blacklisted by many universities.
Journal of Computer Science and Engineering
LIST OF UNREFEREED PAPERS: NO REVIEW, WORTHLESS PUBLICATIONS
An Efficient Sampling Algorithm on Data Streams to Improve Closed Frequent Itemsets Mining Algorithms Results [Full Text]
Mohammad Saniee Abadeh and Mansour Tarafdar
Sampling is one of the most effective data-reduction methods, especially as a preprocessing task for data streams. Given the characteristics of data streams, it is essential to use a sampling algorithm that loses as little information as possible. Although several Closed Frequent Itemset mining algorithms have been proposed for discovering non-redundant itemsets over transactional data streams, their performance can be improved with a sampling step. In this paper, we introduce a Closed Frequent Itemset mining approach based on a reservoir sampling algorithm, called CFISDS, for the landmark window model of transactional data streams.
The algorithm initializes a concept lattice, then processes the transactions arriving in the stream and updates the lattice appropriately in batches. This incremental update keeps the transactions most relevant to the closed frequent itemset mining task in the reservoir, and the sample is extracted on user demand. Experimental evaluations on synthetic and real-life datasets demonstrate the scalability of the algorithm. In addition, the presented measure shows the efficiency of CFISDS compared to other transactional data stream sampling algorithms.
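The abstract builds on reservoir sampling; the paper's lattice-aware CFISDS variant is not given, but the classic reservoir technique it starts from (Vitter's Algorithm R) can be sketched as:

```python
import random

def reservoir_sample(stream, k, rng=None):
    """Keep a uniform random sample of k items from a stream of unknown
    length (Vitter's Algorithm R). At every point, each item seen so far
    has equal probability of being in the reservoir."""
    rng = rng or random.Random()
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)      # fill the reservoir first
        else:
            j = rng.randint(0, i)       # pick a slot in [0, i]
            if j < k:
                reservoir[j] = item     # replace with probability k/(i+1)
    return reservoir
```

CFISDS replaces the uniform replacement policy with one biased toward transactions relevant to closed frequent itemsets, but the streaming one-pass structure is the same.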
Facing the Duplicates in a Greek Hospital Information System [Full Text]
Medical informatics has been growing rapidly over the past decades. Due to advances in new health and welfare applications, vast amounts of patient data are generated every day, so critical medical information such as patient records needs to be effectively organized and analyzed. Duplicate patient records in Hospital Information Systems (HIS) imply problematic management of clinical information and prevent patient data sharing, even among healthcare professionals of the same hospital. This paper explores the reasons for duplicates within the HIS and aims to identify them through both deterministic and probabilistic search methods. Certainty levels for the matching criteria are specified according to matching weights based on several studies of matching algorithms and on empirical inspection. Query design algorithms are analyzed for both methods, and a partial linking of patient data has been achieved through clerical reviews that examine whether the possible matches are true. Our research is still evolving, and much work remains to completely and correctly consolidate the patient data.
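The weight-based probabilistic matching the abstract describes can be sketched as follows; the field names, weights and thresholds here are hypothetical stand-ins, since the paper's actual values are not given:

```python
# Hypothetical per-field agreement weights; in practice these are derived
# from match/non-match probabilities and empirical inspection.
WEIGHTS = {"surname": 4.0, "birth_date": 5.0, "postcode": 2.5}
MATCH_T, POSSIBLE_T = 8.0, 4.0   # assumed certainty-level thresholds

def link_score(rec_a, rec_b):
    """Sum agreement weights over compared fields; disagreements subtract."""
    score = 0.0
    for field, w in WEIGHTS.items():
        score += w if rec_a.get(field) == rec_b.get(field) else -w
    return score

def classify(rec_a, rec_b):
    """Map a record pair to match / possible match / non-match."""
    s = link_score(rec_a, rec_b)
    if s >= MATCH_T:
        return "match"
    if s >= POSSIBLE_T:
        return "possible"   # routed to clerical review, as in the paper
    return "non-match"
```

Pairs falling in the "possible" band are exactly the ones the abstract sends to clerical review.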
Implementation of Approximate Reasoning Techniques using Vague Logic [Full Text]
Supriya Raheja, Reena Dadhich and Smita Rajpal
Approximate reasoning using fuzzy logic provides a realistic framework for human reasoning. Vague logic, introduced by Gau and Buehrer, is a higher-order fuzzy logic, and our present work is based on it. In this paper we define the approximate reasoning implication rules Generalized Modus Ponens (GMP) and Generalized Modus Tollens (GMT) using vague logic. As a special case, we also reduce GMP and GMT to fuzzy logic, illustrated with examples.
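The ordinary fuzzy-logic special case of GMP that the paper reduces to can be sketched with the compositional rule of inference over discrete universes; this uses the Mamdani relation as an illustrative choice, not necessarily the implication the authors adopt:

```python
def gmp(A, B, A_prime):
    """Generalized Modus Ponens via the compositional rule of inference,
    with the Mamdani relation R(x, y) = min(A(x), B(y)).
    A and A_prime are membership dicts over universe X; B is over Y.
    Returns B' where B'(y) = max_x min(A'(x), min(A(x), B(y)))."""
    return {
        y: max(min(A_prime[x], min(A[x], B[y])) for x in A)
        for y in B
    }
```

When the observed fact A' equals the antecedent A (and A is normal), the inferred B' collapses back to B, i.e. classical modus ponens.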
A Hybrid of Simulated Annealing and Fuzzy Data Mining Approach for Concept Extraction from Databases [Full Text]
Ayoub Bagheri, Maryam Zekri and Mohammad-A. Rigi
In recent decades huge amounts of data have been collected and stored. Because a great deal of useful information is hidden in these data, companies need tools to discover patterns and regularities, and since many databases are very large, containing huge numbers of tuples and attributes, efficient automated tools are necessary for acquiring useful information. In this paper an approach is proposed for extracting useful information from a database using a hybrid of simulated annealing and a fuzzy data mining algorithm. The simulated annealing algorithm is employed to find a set of useful fuzzy concepts with good fuzzy support for the output of the fuzzy data mining process. Experiments demonstrate that our method generates rules that perform better than the common fuzzy grid-based data mining approach.
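The simulated annealing half of the hybrid follows the standard scheme; a generic maximization sketch (with fuzzy support as the fitness in the paper's setting, here an arbitrary callable) looks like:

```python
import math
import random

def simulated_annealing(initial, neighbor, fitness,
                        t0=1.0, cooling=0.95, steps=1000, rng=None):
    """Generic simulated annealing (maximization). `neighbor` proposes a
    candidate; worse candidates are accepted with probability
    exp(delta / T), which shrinks as the temperature cools."""
    rng = rng or random.Random()
    current = best = initial
    f_cur = f_best = fitness(initial)
    t = t0
    for _ in range(steps):
        cand = neighbor(current, rng)
        f_cand = fitness(cand)
        delta = f_cand - f_cur
        if delta >= 0 or rng.random() < math.exp(delta / t):
            current, f_cur = cand, f_cand
            if f_cur > f_best:
                best, f_best = current, f_cur
        t *= cooling
    return best, f_best
```

In the paper's setting, `initial` would be a candidate set of fuzzy concepts, `neighbor` a small mutation of that set, and `fitness` the fuzzy support measure.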
Spoofing – A threat to Biometric Systems [Full Text]
Kezia R Badhiti and Prof. Sudha Thatimakula
Biometrics is the measurement of living characteristics, used mainly to authenticate people by identifying and verifying their physiological and behavioral traits. False Match Rate (FMR) and False Non-Match Rate (FNMR) are the two accuracy measures of biometric systems. A biometric solution's False Match Rate is the probability that a user's template will be incorrectly judged to match a different user's template; its False Non-Match Rate is the probability that a user's template will be incorrectly judged not to match his or her enrollment template. Human biometric credentials such as face, iris, voice and fingerprint cannot be kept secret. With the advancement of technology there is every possibility that intruders will steal biometric credentials, create fakes and introduce them as real into the biometric database, which is a threat to biometric systems. This paper surveys various spoofing attacks and explains the importance of liveness detection for human biometric traits.
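The two error rates the abstract defines can be estimated directly from comparison scores at a chosen decision threshold; a minimal sketch, assuming higher score means better match:

```python
def fmr_fnmr(genuine_scores, impostor_scores, threshold):
    """Estimate error rates at a decision threshold.
    FMR:  fraction of impostor comparisons wrongly accepted (score >= t).
    FNMR: fraction of genuine comparisons wrongly rejected (score < t)."""
    fmr = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    fnmr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return fmr, fnmr
```

Raising the threshold trades FMR down for FNMR up; spoofing attacks matter precisely because a good fake raises impostor scores past any usable threshold, which is why liveness detection is needed in addition to score matching.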
Design and Analysis of Using a Programmable Network Interface for High Speed Networks beyond the 10 Gbps [Full Text]
Mohamed Elbeshti, Michael Dixon and Terry Koziniec
Network speeds have been advancing rapidly, providing transmission rates of 10 Gbps and above. These improvements are driven by the demand for enhanced network services, greater bandwidth and the integration of advanced technology. As network speeds exceed 10 Gbps, designing and implementing high-performance Network Interfaces (NIs) for current and Next Generation Network (NGN) server applications that employ TCP/IP and UDP/IP as the communication protocols of choice becomes very challenging. Using a General Purpose Processor (GPP) as the main core processor in the NI to offload the TCP/IP or UDP/IP functions can give the NI important features such as scalability and short development time. However, it is not clear that a GPP can support line speeds above 10 Gbps, and it is necessary to find the clock-rate limit at which such a GPP can sustain NI processing. In this research, we propose a new programmable Ethernet NI (ENI) model designed to support high-speed transmission. The model supports Large Segment Offload (LSO) on the sending side and a novel algorithm for the receiving side called the Receiving Side Amalgamating Algorithm (RSAA). As a result, a 240 MHz RISC core can be used in this Ethernet NI card for a wide range of line speeds up to 100 Gbps when a jumbo packet is assigned as the default size for the network.
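The abstract does not spell out the RSAA, but the general receive-side amalgamation idea it names, coalescing contiguous in-order segments of a flow into one larger buffer so the host handles fewer, bigger units, can be sketched as follows (a generic illustration, not the authors' algorithm):

```python
def amalgamate(segments):
    """Coalesce in-order segments into larger buffers before handing them
    to the host, reducing per-packet interrupt and DMA overhead.
    `segments` is a list of (seq, payload) tuples for one flow; runs whose
    sequence numbers are contiguous are merged, gaps start a new buffer."""
    merged = []
    for seq, payload in segments:
        if merged and merged[-1][0] + len(merged[-1][1]) == seq:
            last_seq, last_payload = merged[-1]
            merged[-1] = (last_seq, last_payload + payload)  # extend the run
        else:
            merged.append((seq, payload))  # gap or new flow position
    return merged
```

A hardware NI would do this in per-flow reassembly buffers with a flush timer; the Python sketch only shows the contiguity test that drives the merge.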
Design and Implementation of Extendable Media Streaming Application: Case Study of Windows 2003 Voice Recorder [Full Text]
Ezekiel U. Okike and Ayorinde Afolayan
Any data that changes meaningfully with respect to time can be characterized as time-based media. Such media data can be obtained from a variety of sources, such as local or network files, cameras, microphones and live broadcasts. A key characteristic of time-based media is that it requires timely delivery and processing: once the flow of media data begins, there are strict timing deadlines that must be met both in receiving and presenting the data. For this reason, time-based media is often referred to as streaming media; it is delivered in a steady stream that must be received and processed within a particular timeframe to produce acceptable results. This study evaluates a streaming medium, the Windows sound recorder application, investigates its limited recording period of 60 seconds, and implements a Unified Modeling Language (UML) based design framework for an extendable media recorder application that overcomes this limitation.
Development and Deployment of Fixed Wireless Access in South West Nigeria: Performance and Evaluation [Full Text]
Oluwaranti Adeniran and Achimugu Philip
TD-PSOLA Based Emotional Speech Generation [Full Text]
A. Manpreet Kaur and B. Parminder Singh
This paper describes a speech generator system that is able to produce synthetic speech from natural recorded speech using TD-PSOLA. Generating emotions in speech is an important research issue, given the requirement of modern human-machine interaction systems to produce emotional speech. The system implements the TD-PSOLA (Time-Domain Pitch-Synchronous Overlap-Add) algorithm, a well-known technique for high-quality pitch and time-scale modification of speech. An important part of the algorithm is pitch marking and prosodic modification, which is used for embedding emotions in speech signals. TD-PSOLA is chosen for adding emotions to natural recorded speech because it has very low complexity and computational cost.
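The TD-PSOLA mechanics can be sketched in a heavily simplified form: extract two-period Hann-windowed frames around each pitch mark, then overlap-add them at re-spaced marks. This sketch assumes a constant pitch period and already-known pitch marks; real TD-PSOLA tracks a varying period and does pitch marking itself:

```python
import math

def psola_pitch_shift(signal, pitch_marks, factor):
    """Minimal TD-PSOLA-style pitch modification. Frames of two pitch
    periods are windowed around each mark, then overlap-added at marks
    re-spaced by period/factor (factor > 1 raises the pitch)."""
    period = pitch_marks[1] - pitch_marks[0]   # assumed constant
    half = period                              # frame spans [-half, half)
    out = [0.0] * len(signal)
    new_marks = [int(round(pitch_marks[0] + k * period / factor))
                 for k in range(int(len(pitch_marks) * factor))]
    for k, m in enumerate(new_marks):
        # map each output mark back to a source mark (repeat/skip frames)
        src = pitch_marks[min(int(k / factor), len(pitch_marks) - 1)]
        for i in range(-half, half):
            if 0 <= src + i < len(signal) and 0 <= m + i < len(out):
                w = 0.5 * (1 + math.cos(math.pi * i / half))  # Hann window
                out[m + i] += w * signal[src + i]
    return out
```

With `factor = 1.0` the Hann windows at 50% overlap sum to one, so the interior of the signal is reconstructed unchanged; emotional prosody comes from varying the factor (and durations) over time.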
The Impact of Information Technology in Nigeria’s Banking Industry [Full Text]
Oluwagbemi Oluwatolani, Abah Joshua and Achimugu Philip
Today, information technology (IT) has become a key element in economic development and a backbone of knowledge-based economies in terms of operations, quality of service delivery and productivity. Taking advantage of IT is therefore an increasing challenge for developing countries. There is now growing evidence that knowledge-driven innovation is a decisive factor in the competitiveness of nations, industries, organizations and firms. Organizations such as banks have benefited substantially from e-banking, one of the IT applications for strengthening competitiveness. This paper presents the current trend in the application of IT in the Nigerian banking industry and gives an insight into how quality banking has been enhanced via IT. The paper further reveals that the deployment of IT facilities in the Nigerian banking industry has brought about fundamental changes in the content and quality of banking business in the country. This analysis of how Nigerian banks have used IT to reengineer their operations is detailed through literature review and observation. Three categories of variables relating to the use and implementation of information technology were considered in this paper: the nature and degree of adoption of innovative technologies, the degree of utilization of the identified technologies, and the impact of the adoption of IT devices on bank operations.
Development of a Window Based Security System for Electronic Data Interchange [Full Text]
Achimugu Philip, Oluwagbemi Oluwatolani and Abah Joshua
Electronic Data Interchange (EDI) is the exchange of standardized documents between computer systems for business use. The objective of this study is to make EDI secure to use, to eliminate human intervention in the transfer of data between business partners so that productivity and efficiency can be improved, and to promote its usage between two or more trading organizations. This paper provides an overview of EDI by describing the traditional problems of exchanging information in business environments, how EDI solves those problems, and the benefits it gives to a company that uses it. The paper also introduces the common EDI standards and explains how EDI works, how it is used over the internet and the security measures implemented. After a critical study of existing EDI methods, the system was implemented in the VB.Net programming language and executed on both a local area network and a wide area network. Finally, an interactive program was developed that handles the transfer of files, with special attention to the security of the items being transferred from one computer workstation to another.
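The paper's actual security measures are not described, but one standard way to protect documents in transit, which a system like this could use, is to attach a keyed message authentication code so the receiver can detect tampering. A minimal sketch with hypothetical helper names (integrity/authenticity only; confidentiality would additionally need encryption):

```python
import hashlib
import hmac

def sign_document(edi_bytes, key):
    """Append an HMAC-SHA256 tag (hex) on its own line so the receiver
    can verify the document was not altered in transit."""
    tag = hmac.new(key, edi_bytes, hashlib.sha256).hexdigest()
    return edi_bytes + b"\n" + tag.encode()

def verify_document(signed, key):
    """Split off the trailing tag line and check it in constant time.
    Returns (ok, body)."""
    body, _, tag = signed.rpartition(b"\n")
    expected = hmac.new(key, body, hashlib.sha256).hexdigest().encode()
    return hmac.compare_digest(tag, expected), body
```

The shared `key` would be agreed between the two trading partners out of band; any modification of the document body in transit makes verification fail.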