ScholarBank@NUS (https://scholarbank.nus.edu.sg)
The DSpace digital repository system captures, stores, indexes, preserves, and distributes digital research material.
Feed generated: Thu, 03 Oct 2024 10:07:04 GMT

Title: Analysis of dataflow diagrams by neural network
URL: https://scholarbank.nus.edu.sg/handle/10635/102858
Authors: Hsu, L.S.; Chan, S.C.; Teh, H.H.
Abstract: Given a hardware configuration with a number of processors and a dataflow diagram, one wishes to determine the best way to assign processors to the processes so that the total time needed to complete an algorithm is minimized. No general algorithm for solving such a problem exists. This paper presents a heuristic method that uses neural networks to arrange the processes in a definite order. At any time during execution, each process in the list has one of three statuses: 'done', 'ready' or 'waiting'. Whenever a processor has completed its task, it marks the completed process as 'done', checks whether any of the waiting processes can be elevated to 'ready' status, and picks the next ready process in the list for execution. This is repeated until all processes in the list are marked 'done'.
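The dispatch loop described in this abstract can be sketched as follows. The process names, the dependency structure, and the priority list are hypothetical stand-ins; in the paper the ordering would come from the neural-network heuristic, which is not reproduced here.

```python
# Sketch of the 'done' / 'ready' / 'waiting' dispatch loop. The `priority`
# ordering stands in for the ordering the paper derives via a neural network.
def run(priority, deps, n_processors=2):
    """priority: process names in preferred execution order.
    deps: dict mapping each process to the set of processes it waits on."""
    status = {p: "waiting" if deps[p] else "ready" for p in priority}
    finished = []
    while len(finished) < len(priority):
        # Each step, the free processors pick the first ready processes.
        picked = [p for p in priority if status[p] == "ready"][:n_processors]
        for p in picked:                      # mark completed work as 'done'
            status[p] = "done"
            finished.append(p)
        for p in priority:                    # promote 'waiting' -> 'ready'
            if status[p] == "waiting" and all(status[d] == "done" for d in deps[p]):
                status[p] = "ready"
    return finished

order = run(["a", "b", "c", "d"],
            {"a": set(), "b": set(), "c": {"a", "b"}, "d": {"c"}})
print(order)  # ['a', 'b', 'c', 'd']
```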
Date: Fri, 01 Jan 1988 00:00:00 GMT

Title: Uncertainty computations in neural networks
URL: https://scholarbank.nus.edu.sg/handle/10635/99610
Authors: Hsu, L.S.; Teh, H.H.; Chan, S.C.; Loe, K.F.
Abstract: In a three-valued neural logic network, the strengths of the nodes are confined to the ordered pairs (1,0), (0,1) and (0,0). The first two pairs represent TRUE and FALSE respectively. The meaning of the third pair depends on the type of logic used: in Kleene's logic, (0,0) represents UNKNOWN; in Bochvar's logic, it represents MEANINGLESS. In this paper we introduce neural networks that allow the strengths to be an ordered pair of real numbers whose sum does not exceed one. Uncertainty is expressed by a sum of less than one. This allows us to treat uncertainties in facts, in rules, and in logical operations in a unified way.
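The ordered-pair representation can be illustrated as follows. The min/max combination rules below are an illustrative Kleene-style choice, not necessarily the exact operations the paper defines; they do preserve the invariant that the two components sum to at most one.

```python
# Truth value = ordered pair (x, y): x = evidence for, y = evidence against,
# with x + y <= 1; any shortfall from 1 represents uncertainty.
# These min/max rules are an illustrative choice, not the paper's own.

def neg(a):
    x, y = a
    return (y, x)

def conj(a, b):   # AND: true only if both true, false if either false
    return (min(a[0], b[0]), max(a[1], b[1]))

def disj(a, b):   # OR: true if either true, false only if both false
    return (max(a[0], b[0]), min(a[1], b[1]))

TRUE, FALSE, UNKNOWN = (1.0, 0.0), (0.0, 1.0), (0.0, 0.0)
partly = (0.6, 0.2)               # uncertain: 0.2 of the evidence is missing
print(conj(TRUE, partly))         # (0.6, 0.2)
print(disj(FALSE, UNKNOWN))       # (0.0, 0.0)
print(neg(partly))                # (0.2, 0.6)
```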
Date: Mon, 01 Jan 1990 00:00:00 GMT

Title: Two-valued neural logic network
URL: https://scholarbank.nus.edu.sg/handle/10635/99609
Authors: Hsu, L.S.; Loe, K.F.; Chan, Sing C.; Teh, H.H.
Abstract: A neural logic network that uses an ordered pair of numbers as its activation is introduced. The advantage is that it can deal with situations in which a proposition can be true, false, or unknown. The disadvantage is that computation is much more time consuming than networks whose activation consists of a single value. The network described in this paper is useful for situations where Boolean logic is sufficient.
Date: Tue, 01 Jan 1991 00:00:00 GMT

Title: Object-oriented language for neural network simulation
URL: https://scholarbank.nus.edu.sg/handle/10635/111195
Authors: Loe, K.F.; Hsu, L.S.; Chan, S.C.; Low, H.B.
Abstract: This paper describes an object-oriented language for the simulation of neural networks on supercomputers. It is based on the general framework of Parallel Distributed Processing (PDP) proposed by D.E. Rumelhart and J.L. McClelland. The purpose of our object-oriented language is to give users a tool for describing their network connections with ease. A preprocessor translates the description into source code in a high-level language suitable for numerical computation; the latter is then compiled and executed on a supercomputer. In this way, the system inherits the user-friendliness of P3 while bypassing its number-crunching bottleneck.
Date: Fri, 01 Jan 1988 00:00:00 GMT

Title: On neural-logic networks
URL: https://scholarbank.nus.edu.sg/handle/10635/99355
Authors: Chan, S.C.; Hsu, L.S.; Teh, H.H.
Abstract: The paper consists of two parts. The first describes a class of networks called inference networks. An inference network is a directed graph with 'primitive', 'OR', 'AND' and 'NOT' nodes. 'Primitive' nodes can be assigned arbitrary truth values, from which the truth values of all other nodes can be calculated step by step along the paths. Inference networks can be used to model all kinds of logics. The second part introduces another class of networks called neural-logic networks. A neural-logic network is also a directed graph whose nodes represent logical statements and whose edges represent logical links between these statements. However, its nodes are not classified as OR, AND or NOT nodes. The truth value of each node is an ordered pair (x, y), where x and y are non-negative real numbers with 0 ≤ x + y ≤ 1. The number x indicates the 'amount' of evidence that the statement is true, while y indicates the 'amount' of evidence that it is false. A neural-logic network is basically a neural network; however, it behaves very much like a logical system and yet contains all the features of an inference network.
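The step-by-step evaluation of an inference network described in the first part can be sketched as follows, here restricted to Boolean truth values. The node names and the example graph are hypothetical.

```python
# Step-by-step evaluation of an inference network: a directed acyclic graph
# whose non-primitive nodes are 'OR', 'AND' or 'NOT' over their inputs.

def evaluate(graph, primitives):
    """graph: node -> (op, [input nodes]); primitives: node -> bool."""
    values = dict(primitives)          # primitive nodes are given truth values

    def value(n):
        if n not in values:
            op, inputs = graph[n]
            vals = [value(i) for i in inputs]   # propagate along the paths
            if op == "NOT":
                values[n] = not vals[0]
            else:
                values[n] = {"OR": any, "AND": all}[op](vals)
        return values[n]

    for n in graph:
        value(n)
    return values

net = {"c": ("AND", ["a", "b"]), "d": ("NOT", ["c"]), "e": ("OR", ["c", "d"])}
print(evaluate(net, {"a": True, "b": False}))
```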
Date: Fri, 01 Jan 1988 00:00:00 GMT

Title: Probabilistic neural-logic networks
URL: https://scholarbank.nus.edu.sg/handle/10635/99583
Authors: Teh, H.H.; Chan, S.C.; Hsu, L.S.; Loe, K.F.
Abstract: Summary form only given. Recently, a novel class of networks called neural-logic networks was proposed by the authors' research group to integrate logical reasoning capability with the concepts and techniques of the conventional neural network approach. The objective of this study is to extend the neural-logic network model to incorporate the theory of probability. The extended model predicts the probability that the matching pattern or the possible set of solutions is correct, giving the user an indication of how reliable the conclusion is.
Date: Sun, 01 Jan 1989 00:00:00 GMT

Title: Imprecise reasoning using neural networks
URL: https://scholarbank.nus.edu.sg/handle/10635/99533
Authors: Hsu, Loke-Soo; Teh, Hoon-Heng; Chan, Sing-Chai; Loe, Kia Fock
Abstract: A logic is defined that weighs all available information, and it is implemented using an emulated neural network. This allows the resulting expert system to learn from examples. It also handles fuzziness in the facts, the rules, and the logical operations in a natural and uniform way. It is more realistic than the certainty-factor formalism, which discards information by taking the minimum of the certainty factors for the AND operation and the maximum for the OR operation. In the present scheme, all activations are weighted and taken into account. Compared with classical expert systems, the present system has the advantage of operating in two modes. In the normal mode, rules are given by experts and the weights are assigned values. In the learning mode, the weights are allowed to vary while the system is fed with examples.
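The contrast drawn in this abstract can be made concrete with a small numerical example. The certainty values and weights below are hypothetical; the weighted sum merely illustrates the general idea of letting every activation contribute, not the paper's exact combination rule.

```python
# Certainty-factor AND keeps only the minimum, discarding the other operands;
# a weighted combination lets every antecedent contribute.
facts = [0.9, 0.8, 0.1]          # certainties of three antecedent facts (hypothetical)

cf_and = min(facts)              # certainty-factor style AND: only 0.1 survives

weights = [0.5, 0.3, 0.2]        # hypothetical learned weights
weighted = sum(w * f for w, f in zip(weights, facts))  # all facts count

print(cf_and)                # 0.1
print(round(weighted, 2))    # 0.71
```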
Date: Mon, 01 Jan 1990 00:00:00 GMT

Title: Multi-valued neural logic networks
URL: https://scholarbank.nus.edu.sg/handle/10635/99561
Authors: Hsu, Loke-Soo; Teh, Hoon-Heng; Chan, Sing-Chai; Loe, Kia Fock
Abstract: Two types of networks that are useful in developing expert systems are proposed. The probabilistic network can be used for predictive types of expert systems, whereas the fuzzy network is more suitable for expert systems that help in decision making. In both cases, the expert system can operate in two modes. In the normal mode, rules are given by experts and weights are assigned values. In the learning mode, weights are allowed to vary while the system is fed with examples.
Date: Mon, 01 Jan 1990 00:00:00 GMT

Title: Neural three-valued-logic networks
URL: https://scholarbank.nus.edu.sg/handle/10635/99562
Authors: Chan, S.C.; Hsu, L.S.; Brody, S.; Teh, H.H.
Abstract: Summary form only given. The authors propose a novel neural network, called a neural-logic network, that incorporates both pattern-processing and logical-reasoning capabilities within a single neural network framework. This class of neural networks can model classical two-valued Boolean logic effectively. In addition, it is an ideal model for studying 'human logic', which is multivalued, fuzzy, and biased. A natural application of neural-logic networks is the study of connectionist expert systems.
Date: Sun, 01 Jan 1989 00:00:00 GMT

Title: Temporal neural logic networks
URL: https://scholarbank.nus.edu.sg/handle/10635/99602
Authors: Teh, H.H.; Hsu, L.S.; Chan, S.C.; Loe, K.F.
Abstract: Neural logic networks are generalized to cater to logical systems in which the validity of rules and facts changes with time. To construct a temporal network, the validity of the rules and facts is sampled at a selection of time instances to determine the connecting weights at those instances. The weights of the temporal network are then defined as functions that reproduce the known values when the proper time is substituted. Three theorems on temporal pattern recognition are proved.
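The construction described here, sampling weights at chosen instances and then defining a weight function that reproduces them, can be sketched with linear interpolation. The sample values are hypothetical, and linear interpolation merely stands in for whatever functional form the paper actually fits.

```python
# Weights sampled at chosen time instances are turned into a weight
# *function* w(t) that reproduces the known values at the sampled times.
# Linear interpolation is an illustrative stand-in for the paper's fit.

def weight_function(samples):
    """samples: list of (time, weight) pairs, sorted by time -> callable w(t)."""
    def w(t):
        for (t0, w0), (t1, w1) in zip(samples, samples[1:]):
            if t0 <= t <= t1:
                return w0 + (w1 - w0) * (t - t0) / (t1 - t0)
        raise ValueError("t outside sampled range")
    return w

w = weight_function([(0, 0.2), (1, 0.8), (2, 0.5)])
print(w(0), w(1), w(2))   # reproduces the sampled weights
print(w(0.5))             # interpolated value between the first two samples
```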
Date: Mon, 01 Jan 1990 00:00:00 GMT