Computer Aided Software Design Tools
Computer-Aided Software Engineering
Database Development Process
Ming Wang, Russell K. Chan, in Encyclopedia of Information Systems, 2003
I.G. CASE Tools
CASE stands for computer-aided software engineering. CASE tools can be applied to support database development. There are three types of CASE tools: upper-CASE, lower-CASE, and integrated CASE tools:
1. The upper-CASE tool supports database planning and design, including data collection and analysis, data model generation, and application design.
2. The lower-CASE tool supports database implementation, including data conversion, report generation, application code generation, prototyping, and testing.
3. The integrated CASE tool supports all phases of database development and provides the functionality of both upper-CASE and lower-CASE in one tool.
CASE tools were created to improve the productivity and quality of database development. The advantages of using them are:
• Promoting standards for database development for data, diagrams, documentation, and projects, making them easy to reuse and maintain
• Keeping data linked, consistent, and integrated for the organization
• Reducing database development time, since some CASE tools can automatically generate diagrams and executable application code from the design (a minimal sketch of this kind of code generation follows)
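As a loose illustration of that last point, the sketch below shows how a lower-CASE tool might derive SQL DDL from a design-level entity description. The "Person" entity, its attributes, and the generate_ddl helper are invented for the example; real tools generate code from a full metadata repository rather than a hand-written dictionary.

```python
# Illustrative sketch only: deriving SQL DDL from a design-level entity
# description, in the spirit of lower-CASE code generation. Entity and
# attribute names are hypothetical.

def generate_ddl(entity: dict) -> str:
    """Emit a CREATE TABLE statement for one entity description."""
    columns = []
    for attr in entity["attributes"]:
        line = f"    {attr['name']} {attr['type']}"
        if attr.get("required"):
            line += " NOT NULL"
        columns.append(line)
    columns.append(f"    PRIMARY KEY ({', '.join(entity['primary_key'])})")
    return f"CREATE TABLE {entity['name']} (\n" + ",\n".join(columns) + "\n);"

person = {
    "name": "Person",
    "attributes": [
        {"name": "person_id", "type": "INTEGER", "required": True},
        {"name": "last_name", "type": "VARCHAR(40)", "required": True},
        {"name": "birth_date", "type": "DATE"},
    ],
    "primary_key": ["person_id"],
}

print(generate_ddl(person))
```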
URL:
https://www.sciencedirect.com/science/article/pii/B0122272404000265
Domain 4
Eric Conrad, ... Joshua Feldman, in CISSP Study Guide (Second Edition), 2012
Computer-aided software engineering (CASE)
Computer-aided software engineering (CASE) uses programs to assist in the creation and maintenance of other computer programs. Programming has historically been performed by (human) programmers or teams; CASE adds software to the programming team. There are three types of CASE software [3]:
1. Tools support only specific tasks in the software-production process.
2. Workbenches support one or a few software-process activities by integrating several tools in a single application.
3. Environments support all or at least part of the software-production process with a collection of Tools and Workbenches.
Fourth-generation computer languages, object-oriented languages, and GUIs are often used as components of CASE.
URL:
https://www.sciencedirect.com/science/article/pii/B9781597499613000054
Making and Using XML: The Data Managers' Perspective
Peter Aiken, David Allen, in XML in Data Management, 2004
CASE Technologies
CASE, which stands for computer aided software engineering, is a class of tools that is already heavily used in every type of organization. This particular classification is somewhat broad, but for the purposes of this discussion we will try to restrict it to software packages that help in systems development.
Figure 5.3 shows the first instance of a CASE tool incorporating what we consider desirable XML support. Since Visible Advantage has its origins with Clive Finkelstein, who is known as the father of information engineering, it was not surprising that it was the first to provide quality XML support. The figure shows just how integrated the process of managing the CASE tool metadata could be. While the entity "Person" is described in standard information engineering format, it can be simultaneously displayed and output as XML. This degree of integration is one of the most important aspects of CASE tool usage, specifically maintaining the model (in this case, the metadata of "Person") separately from the way it is represented. Figure 5.4 illustrates the same degree of integration, in this case managing schemas concurrently with design objects.
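To make the model-versus-representation point concrete, here is a minimal sketch of rendering one entity description as XML with the Python standard library. The element names and the attribute list are assumptions for illustration; they do not reproduce Visible Advantage's actual output format.

```python
# Sketch: one entity definition held as neutral metadata, rendered as XML.
# The element and attribute names are assumptions; real CASE tools define
# their own schemas.
import xml.etree.ElementTree as ET

entity = {"name": "Person", "attributes": ["person_id", "last_name", "birth_date"]}

root = ET.Element("entity", name=entity["name"])
for attr in entity["attributes"]:
    ET.SubElement(root, "attribute", name=attr)

print(ET.tostring(root, encoding="unicode"))
# e.g. <entity name="Person"><attribute name="person_id" /> ... </entity>
```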
In many ways, the capabilities of Visible Advantage's products are not representative of the CASE tool industry. CASE tool usage peaked around 1992–1993, when more than two-thirds of organizations used the technologies to help manage their metadata. Since 1993, CASE tool usage has slipped to less than one in three organizations! A number of data modelers have professed to us that PowerPoint is now their favorite "CASE tool." The decline may be due in part to the lack of products desired by the data management community.
The drop in CASE tool usage has also been due to the usage myth illustrated on the left half of Figure 5.5. The old belief has been that all organizational metadata must "fit" into a single CASE technology. The problem with this is that access to the metadata from outside of the CASE tool has traditionally been very limited, which discourages widespread use. So it comes as no surprise that the valuable metadata in these tools is of limited value when few actually use it. Perhaps one of the reasons for the CASE tool myth is that software vendors have been eager to represent their products as something that could handle any situation, and the price tags on the products have provided added impetus for organizations to try to fit everything into one tool.
The "new" model of CASE tool usage is shown on the right of Figure 5.5. Notice how XML-based integration of the metadata is both the input and the architectural basis. If integrating data using metadata works, then consider how managing metadata with additional metadata will also help out. Integrated metadata sits in an open repository, ready for a variety of CASE-based tools and methods to operate on it as utilities. The subsequent metadata is widely accessible via web, portal, XML, database management system, and so on. The accessibility in turn drives continued use of the data.
URL:
https://www.sciencedirect.com/science/article/pii/B9780120455997500054
Perceptions of strategic value and adoption of e-Commerce: a theoretical framework and empirical test
Elizabeth E. Grandón, J. Michael Pearson, in Value Creation from E-Business Models, 2004
CASE
Adoption of CASE has also been addressed in the IT adoption literature. Premkumar and Potter (1995) examined the impact of various organizational and technology characteristics on the adoption of CASE tools. Within the organizational factors, they considered top management support, product champion, and IS expertise. The variables included in the technical factor were relative advantage, cost, complexity, technical compatibility, and organizational compatibility. Ninety IS managers participated in the field survey. Results revealed that the existence of a product champion, strong top management support, lower IS expertise, the relative advantage of CASE technology over other alternatives, and a conviction of the technology's cost effectiveness were strong indicators of the adoption of CASE tools.
URL:
https://www.sciencedirect.com/science/article/pii/B9780750661409500102
Dynamic scenario-based approach to re-engineering of legacy telecommunication software
N. Mansurov, R. Probert, in SDL '99, 1999
5.1 Description of the ToolExchange system
The ToolExchange implements an integration mechanism for extensible multi-component CASE environments. ToolExchange provides interoperability between loosely connected interactive tools by allowing them to perform remote services.
The ToolExchange supports the following model. Each tool has a unique symbolic name. When a tool dynamically connects to the ToolExchange, it is registered at the ToolExchange as a "subscriber" with a unique identifier. A service can be requested either by symbolic name or by unique identifier. When the service is requested via symbolic name, the ToolExchange first checks whether there is any active subscriber with that name. If an active subscriber exists, the ToolExchange sends the service request to it. If no active subscriber exists, the ToolExchange launches the tool using the "tool command line" for that symbolic name. When the service is requested via the unique identifier, the ToolExchange checks whether the particular subscriber is still connected and sends the service request to it. The ToolExchange establishes a temporary connection between the service requestor and the service provider by sending the unique identifier of the service requestor to the service provider, so that the latter can send back the reply.
ToolExchange implements a simple lightweight text-based protocol for communication between tools (as opposed to e.g. CORBA [16]).
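The dispatch rules described above lend themselves to a short sketch. The class below is illustrative only: the message format, the connection objects, and the method names are assumptions, not the actual ToolExchange implementation or protocol.

```python
# Sketch of the dispatch model described above (illustrative only): look up an
# active subscriber by symbolic name or unique id, launch the tool if no active
# subscriber exists, and forward the service request together with the
# requestor's id so the provider can reply.
import itertools
import subprocess

class ToolExchange:
    def __init__(self, tool_command_lines):
        self.tool_command_lines = tool_command_lines   # symbolic name -> command line
        self.subscribers = {}                          # unique id -> (name, connection)
        self._ids = itertools.count(1)

    def register(self, name, connection):
        uid = next(self._ids)                          # "subscriber" unique identifier
        self.subscribers[uid] = (name, connection)
        return uid

    def request_by_name(self, name, request, requestor_uid):
        for uid, (sub_name, conn) in self.subscribers.items():
            if sub_name == name:
                conn.send((requestor_uid, request))    # provider replies via requestor id
                return
        # no active subscriber: launch the tool via its registered command line
        subprocess.Popen(self.tool_command_lines[name], shell=True)

    def request_by_id(self, uid, request, requestor_uid):
        if uid in self.subscribers:                    # still connected?
            self.subscribers[uid][1].send((requestor_uid, request))
```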
URL:
https://www.sciencedirect.com/science/article/pii/B9780444502285500229
Using CASE Tools for Database Design
Jan L. Harrington, in Relational Database Design and Implementation (Fourth Edition), 2016
The Drawing Environment
To this point, you've been reading about the way in which the functions provided by CASE software can support the database design effort. In this last section we will briefly examine the tools you can expect to find as part of CASE software, tools with which you can create the types of documents you need.
Because many of the documents you create with CASE software are diagrams, the working environment of a CASE tool includes a specialized drawing environment. For example, in Figure 12.12, you can see the drawing tools provided by the sample CASE tool for creating ER diagrams. (Keep in mind that each CASE tool will differ somewhat in the precise layout of its drawing tool bars, but the basic capabilities will be similar.)
The important thing to note is that the major shapes needed for the diagrams—for ER diagrams, typically just the entity and relationship line—are provided as individual tools. You therefore simply click the tool you want to use in the tool bar and draw the shape in the diagram, much like you would if you were working with a general-purpose object graphics program.
URL:
https://www.sciencedirect.com/science/article/pii/B9780128043998000120
Introduction
Carol Britton, Jill Doake, in A Student Guide to Object-Oriented Development, 2005
UML and CASE tools.
One of the advantages of a standardized language for producing diagrams during system development is that a number of CASE tools have been developed to provide automated support for developers.
CASE (Computer Aided Software Engineering) refers to any piece of software that has been designed to help people develop systems. In theory, a CASE tool can be a simple drawing program or basic debugger, but today almost all CASE tools cover the whole of the system life cycle, and provide automated support for all development activities, both technical and managerial. For object-oriented systems, tools such as Rational Rose™ and Together™ allow developers to produce UML models of the system which are syntactically correct, consistent with each other and which can be refined and developed to produce executable code. The diagrams in this book were originally produced using both Rational Rose™ and Together™, but the code has been written from scratch in order to show how the system develops from diagrams to code, and to emphasize certain points that we think are important when you are learning about object-oriented development. However, it would have been perfectly possible to generate the skeleton of a running program from the diagrams shown here. Even though we did not use this facility, we still benefited from the 'nanny' characteristics of the CASE tools, which remembered details of previous diagrams, allowed us to navigate between diagrams that were related, and pointed out when we made stupid mistakes.
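As a hedged illustration of the skeleton-generation facility mentioned above, the following few lines emit an empty class, with attribute and method stubs, from a minimal description of one class-diagram entry. The "BankAccount" example is invented and is not taken from the book's case study.

```python
# Sketch: turning one class-diagram entry into a code skeleton, in the spirit
# of the forward-engineering facility CASE tools offer. The class description
# is a made-up example.

def class_skeleton(name, attributes, operations):
    lines = [f"class {name}:"]
    params = ", ".join(attributes)
    lines.append(f"    def __init__(self, {params}):")
    for attr in attributes:
        lines.append(f"        self.{attr} = {attr}")
    for op in operations:
        lines.append("")
        lines.append(f"    def {op}(self):")
        lines.append("        pass  # body to be written by the developer")
    return "\n".join(lines)

print(class_skeleton("BankAccount", ["owner", "balance"], ["deposit", "withdraw"]))
```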
URL:
https://www.sciencedirect.com/science/article/pii/B9780750661232500013
Mass Simulations Based Design Approach and its Environment
Satoshi Miyata, ... Mihoko Fukumoto, in Parallel Computational Fluid Dynamics 2002, 2003
1 INTRODUCTION
Nowadays, computer simulations are perceived as standard design tools; many simulation programs are used in design and manufacturing processes and are collectively referred to as computer-aided engineering (CAE). Over the past decade, most users have run CAE software step by step. In this procedure, a user must check each calculation result at the end of each run and make decisions and predictions that indicate the direction of subsequent simulation runs.
Meanwhile, since the mid-1990s, new design methodologies have been gaining popularity, typically in the automotive industry: for example, design of experiments (DOE), response surface modeling (RSM), Monte Carlo simulation (MCS), and many meta-heuristic optimization algorithms such as genetic algorithms (GA), artificial neural networks (ANN), and simulated annealing (SA). These new design approaches have been shifting users' stance toward CAE from the traditional "step-by-step" style to a "mass-based" style that runs many CAE calculations at once and collects the results in bulk.
In this paper, a genetic algorithm-based approach is discussed. To demonstrate its effectiveness, the proposed approach is applied to a multi-objective optimization problem for a diesel engine.
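To illustrate the "mass-based" style in miniature, the sketch below runs a toy genetic algorithm that evaluates a whole population of candidate designs per generation. The objective function is a stand-in for an expensive CAE simulation and has no connection to the diesel-engine problem in the paper.

```python
# Minimal genetic-algorithm sketch of the "mass-based" style: evaluate a whole
# population of candidate designs per generation rather than one run at a time.
import random

def objective(design):
    # placeholder for an expensive CAE calculation
    return sum((x - 0.5) ** 2 for x in design)

def evolve(pop_size=20, n_vars=4, generations=30):
    population = [[random.random() for _ in range(n_vars)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=objective)           # rank the "massive results"
        parents = scored[: pop_size // 2]                     # selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [random.choice(pair) for pair in zip(a, b)]   # crossover
            i = random.randrange(n_vars)
            child[i] += random.gauss(0, 0.1)                  # mutation
            children.append(child)
        population = parents + children
    return min(population, key=objective)

print(evolve())
```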
URL:
https://www.sciencedirect.com/science/article/pii/B9780444506801500632
Systems Design
Jeremy Rasmussen, in Encyclopedia of Information Systems, 2003
IV.C. Modeling
One of the techniques for evaluating design alternatives is modeling. The model for the preferred alternative will be expanded and used to help manage the system throughout its life cycle. Many types of system models are used, such as physical analogs, analytic equations, state machines, block diagrams, functional flow diagrams, object-oriented models, and computer simulations. A model is a representation of reality. Logical or functional models show what a system is or does. They are implementation independent; that is, they depict the system independent of any technical implementation.
Systems design engineering is responsible for creating a product and also a process for producing it. So, models should be constructed for both product and process. Process models allow engineers to study scheduling changes, create dynamic PERT charts, and perform sensitivity analyses to show the effects of delaying or accelerating certain subprojects. Running the process models reveals bottlenecks and fragmented activities, reduces cost, and exposes duplication of effort. Product models help explain the system. These models are also used in tradeoff studies and risk management.
The systems engineering process must partition the overall system into subsystems, the subsystems into assemblies, etc. Reusability should be considered in creating subsystems. For new designs, subsystems should be created so that they can be reused in future products. For redesign, subsystems should be created to maximize the use of existing, particularly commercially available, products.
Systems engineers must also decide whether to make or buy the subsystems, first trying to use commercially available subsystems. If nothing satisfies all the requirements, then modification of an existing subsystem should be considered. If this proves unsatisfactory, then some subsystems will have to be designed in-house. Engineers designing one subsystem must understand the other subsystems that their system will interact with. Typically, flexibility is more important than optimality.
This is the first phase in the procurement cycle where significant effort is allocated to develop hardware. In the prior two phases, the product was mostly on paper, e.g., in study reports. Now, the system begins to take shape and the systems engineering role broadens. Systems engineers are involved in key tasks such as preparing or upgrading top-level specifications, supporting preliminary designs of hardware and software, integration of subsystems tradeoffs and designs, and detailed development planning, including scheduling and the "waterfall" of ever more inclusive system design reviews.
During the systems engineering process architectures are generated to better describe and understand the system. The word "architecture" is used in various contexts in the general field of engineering. It is used as a general description of how the subsystems join together to form the system. However, systems design engineering generally recognizes three universally usable architectures that describe important aspects of the system: functional, physical, and system architectures (see Table V).
Table V. Architectural Types
• Functional architecture—Identifies and structures the allocated functional and performance requirements
• Physical architecture—Depicts the system product by showing how it is broken down into subsystems and components
• System architecture—Identifies all the products (including enabling products) that are necessary to support the system and, by implication, the processes necessary for development, production/construction, deployment, operations, support, disposal, training, and verification
Defining the system architecture means choosing the high-level approach that will determine the components and subsystems of the system. Some of these approaches might include:
1. Object-oriented design, or structured analysis/functional decomposition
2. Distributed or centralized computing
3. Use of commercial-off-the-shelf (COTS) components or a custom-designed proprietary system
4. Programming language (e.g., C++ versus Java)
Just as an architect or engineer may employ computer-aided design (CAD) to help construct building plans, an information system software analyst has a number of automated tools to help generate models. Computer-aided software engineering or CASE tools, which automate, manage, and simplify the software development process, encompass programs that summarize initial requirements, develop data flow diagrams, schedule development tasks, prepare system description documentation, control software versions, and even develop program code. Various organizations offer CASE software capable of supporting some or all of these activities. Some CASE tools provide special support for object-oriented programming, but the general term "CASE" may be applied to any type of software development environment.
After the high-level or preliminary design is concluded, analysis efforts continue and are intensified. Higher fidelity models and simulations of system elements and the entire system are developed and thoroughly exercised. Through the use of design of experiment techniques, a number of simulation runs and tests can characterize system and element performance under a wide variety of conditions.
One of the main considerations of systems design engineering is the interfaces that exist between subsystems and interfaces that exist between the main system and the external world. Subsystems should be defined along natural boundaries. When the same information travels back and forth among different subsystems a natural activity may have been fragmented. Subsystems should be defined to minimize the amount of information to be exchanged between the subsystems. Well-designed subsystems send finished products to other subsystems. Feedback loops around individual subsystems are easier to manage than feedback loops around interconnected subsystems.
IV.C.1. Object-Oriented Design
The term object-oriented (or OO) is generally used to describe a system that deals primarily with different types of objects, and in which the actions the system can take depend on the type of object being manipulated. The primary purpose of object-oriented design, or OOD, is to provide a sufficient description and specification to enable developers to build, deploy, test, and reuse system components. The design should be flexible enough to respond to changes in the business requirements and the implementation.
OOD is primarily concerned with developing an object-oriented model of a system to implement the identified requirements. Many OOD methods have been described since the late 1980s. The most popular OOD methods include Booch, Buhr, Wasserman, and the HOOD method developed by the European Space Agency. OOD can yield the following benefits: maintainability through simplified mapping to the problem domain, which provides for less analysis effort, less complexity in system design, and easier verification by the user; reusability of the design artifacts, which saves time and costs; and productivity gains through direct mapping to features of object-oriented programming languages.
The first step in doing object-oriented programming is data modeling, which is the analysis of data objects that are used in a business or other context and the identification of the relationships among these data objects. As a result of data modeling, a system has defined classes that provide the templates for program objects. Several approaches or methodologies to data modeling and its notation have recently been combined into the unified modeling language (UML), which is expected to become a standard modeling language.
The UML notation is derived from and unifies the notations of three object-oriented design and analysis methodologies: Grady Booch's methodology for describing a set of objects and their relationships, James Rumbaugh's object-modeling technique (OMT), and Ivar Jacobson's approach which includes a "use case" methodology. UML is an accepted standard of the Object Management Group (OMG), which is the author of the common object request broker architecture or CORBA. CORBA is the leading industry standard for distributed object programming. UML is a notation system that developers use to communicate about a common model, and it is developed from methodologies that also describe the processes used in developing and using the model.
The diagrams of UML represent typical artifacts that software developers might prepare to document an information system's organization, behavior, functionality, and design. The diagrams central to UML are given in Table VI and Figs. 4 and 5.
Table VI. UML Diagrams
• Class diagram—Represents the static structure of an information system by identifying the classes of the system. For each class, attributes, operations, and associations/relationships to other objects are identified. A class diagram is shown in Fig. 4.
• Use case diagram—Depicts external users of an information system, the boundary of a system, and all or a portion of the use cases that the system comprises. A use-case diagram is shown in Fig. 5.
• Message trace diagram—Depicts a single scenario of stimulus and response interchanges between a specific object and its external users, or between objects internal to a system.
• Object message diagram—Depicts stimulus and response exchanges between two or more objects.
• State diagram—Depicts aspects of the behavior of an information system and serves as a formal specification of the behavior of a class.
• Module diagram—Depicts elements of the object model in terms of their allocation to physical modules.
• Platform diagram—Depicts the physical topology upon which a software system is designed to execute.
IV.C.2. Structured Analysis
Structured analysis is a process-oriented approach. The technique is simple in concept: the analyst defines what the system should do before deciding how it should do it. The new system's specification evolves from a series of data flow diagrams. These diagrams show the flow and storage of data, as well as the processes that respond to and change data. While the OOD paradigm focuses on objects, classes, and inheritance, the structured paradigm focuses primarily on decomposing behaviors.
Structured analysis employs data flow and entity relationship models as well as process specifications and data dictionaries to provide a complete system design view. Structured techniques use a top down approach, starting with the overall system and decomposing it functionally to solve a specific problem. These analysis techniques are in sharp contrast to object-oriented techniques that focus less on functional decomposition and more on identifying objects from the enterprise domain and specifying their properties and behavior.
Structured analysis modeling typically uses process modeling, which involves graphically representing the functions, or processes, which capture, manipulate, store, and distribute data between a system and its environment and between components within a system. Data flow or sequence flow diagrams, which are high-level descriptions of the flow of data or events in the system, will aid in understanding system processes. Creating a good data flow or sequence flow diagram and its description is generally the first step in laying out a system design. Figure 6 shows a sample data flow diagram (DFD) and Fig. 7 shows a sequence flow diagram (SFD). The systems design engineer must collect typical sequences of events that the proposed system will go through. Sequences can be descriptions of historical behavior or can be based on mental models for future behavior. Such descriptions of input and output behaviors as functions of time are called sequence diagrams, behavioral scenarios, operational scenarios, operational concepts, operational sequences, use cases, threads, input and output trajectories, logistics or interaction diagrams.
DFDs can be unleveled (i.e., all entities, processes, stores, and vectors are shown in a single diagram) or leveled (i.e., showing a top-down decomposition of complexity). In any leveled diagram, the same number (and identities) of inflows and outflows must exist. Figure 8 represents a typical situation in which a student borrows a book from a library that has a computerized library management system.
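The balancing rule just stated can be checked mechanically. The sketch below does so for flows held as simple sets; the diagram contents (borrow request, loan record, and so on) are invented for the example.

```python
# Sketch: checking the "balancing" rule for leveled DFDs, i.e. that a child
# diagram has exactly the same inflows and outflows as the parent process it
# expands. The flow names are invented for the example.

def is_balanced(parent_flows, child_flows):
    return (set(parent_flows["in"]) == set(child_flows["in"])
            and set(parent_flows["out"]) == set(child_flows["out"]))

parent = {"in": {"borrow request", "member id"}, "out": {"loan record", "due date"}}
child = {"in": {"borrow request", "member id"}, "out": {"loan record", "due date"}}

print(is_balanced(parent, child))   # True: the decomposition is consistent
```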
Another tool used in many structured analysis methodologies is the entity-relationship diagram or ERD. An ERD is a data modeling technique that creates a graphical representation of the entities, and the relationships between entities, within an information system. ERDs incorporate entities (which are the labeled boxes in a diagram) and relationships (which are the lines). Figure 9 shows a sample ERD.
First developed by Peter Chen in 1975, ERDs are widely used for specifying databases because they show the interrelationships between the data and help develop a conceptual model of the data. An ERD differs from a standard data flow diagram (DFD) in that an ERD attempts to categorize data into sets having like properties (and the relationship between the data sets), while a DFD attempts to model the data transformation and the processes needed to transform the data. ERDs also differ from the sequence flow diagrams described above, which attempt to model the time/state dependent behavior of a system and the changes of state.
Functional decomposition is the part of structured analysis that seeks to break a large, perhaps unwieldy problem down into smaller, more manageable chunks. Systems design engineers perform functional decomposition on new systems in order to map functions to physical components and system requirements and to ensure that all necessary tasks are listed and that no unnecessary tasks are requested. This list of functions becomes the basis for the work breakdown structure (WBS). A WBS is an indispensable planning tool that divides a project into tasks and subtasks. The tasks are numbered to indicate their relationship to each other and placed into a schedule.
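A small sketch of the numbering convention mentioned above: tasks and subtasks receive hierarchical numbers that make their relationships explicit. The task names are placeholders.

```python
# Sketch: assigning hierarchical WBS numbers (1, 1.1, 1.2, 2, ...) to a task
# breakdown held as nested lists. Task names are placeholders.

def number_tasks(tasks, prefix=""):
    numbered = []
    for i, (name, subtasks) in enumerate(tasks, start=1):
        code = f"{prefix}{i}"
        numbered.append((code, name))
        numbered.extend(number_tasks(subtasks, prefix=code + "."))
    return numbered

wbs = [
    ("Requirements analysis", [("Interviews", []), ("Specification", [])]),
    ("Design", [("High-level design", []), ("Detailed design", [])]),
]

for code, name in number_tasks(wbs):
    print(code, name)
# 1 Requirements analysis
# 1.1 Interviews
# ...
```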
URL:
https://www.sciencedirect.com/science/article/pii/B0122272404001799
Artificial intelligence-based computational fluid dynamics approaches
Vishwanath Panwar, ... Sampath Emani, in Hybrid Computational Intelligence, 2020
Appendix
Serial no. | CFD approach | Materials used/studied | Results | Turbulence models/algorithms/associated equations | References |
---|---|---|---|---|---|
18 | The loop of design–analysis–redesign in the optimization process | Computer-aided design (CAD) and computer-aided engineering (CAE) tools | The proposed model was found to yield significant reductions in monotonous repetitive tasks, as well as in the time required to solve computation-intensive design optimization problems | Commercial CAD–CAE software | [19] |
19 | A novel piezoelectric flex transducer based on the Cymbal device | Energy-harvesting devices | For the optimal piezoelectric flex transducer harvesting device, findings indicated that, with a design safety factor of 2.0, the electrical power generated can be up to about 6.5 mW | Sequential quadratic programming on metamodels | [20] |
21 | A metamodel-based optimization approach employing several metamodels | Francis type turbine runner | From the results, it was reported that the proposed model reduces the design process time significantly. The reduction was found to be a factor of 9.2. It was also established that when optimization via the proposed approach is employed, there is a significant increase in turbine performance and also a reduction in the turbine blades' cavitation (often associated with lifespan reduction and harm to the turbine) | Metamodel-based optimization | [22] |
30 | Concurrent structural topology optimization and deposition path planning | Additively manufactured parts | With most of the planned deposition paths aligned to principal stress directions, the combination was observed to enhance structural performance | Level set topology optimization algorithm and the iso-value level set contours | [32] |
35 | A data-driven method aimed at the efficient and compact representation of glider aerodynamics | Hand-launched free-flight glider airplanes | The proposed technique was found to be well suited to the inexpensive and easy design or creation of hobby-grade hand-launched gliders exhibiting creative shapes | Aerodynamic model | [35] |
36 | Gene expression programming (GEP) | Side weir discharge coefficient in rectangular sharp-crested side weirs | The best performance (R² = 0.947, RMSE = 0.037, MARE = 0.05, SI = 0.067, BIAS = 0.01) was found for the model using the ratio of weir height to its length (p/y1), the ratio of weir length to depth of upstream flow (b/y1), the Froude number (F1), and the dimensionless weir length (b/B). Hence, GEP could be used to estimate the discharge coefficient of rectangular sharp-crested side weirs | Coefficient of determination (R²), BIAS, scatter index (SI), mean absolute relative error (MARE), and root mean square error (RMSE) | [36] |
37 | Charged system search (CSS) | Structural optimization problems | The hybrid CSS and PSO exhibited higher convergence and better performance | Hybrid charged system search and particle swarm optimization (HCSSPSO) | [37] |
38 | Adaptive neuro fuzzy group method of data handling | Pipe flows | Major parameters that were examined included the pipe diameter, the pipe friction coefficient, the average velocity, and the Reynolds number. Findings indicated that when compared to previous numerical solutions, the proposed relations proved simpler in terms of the effective evaluation of pipe flows' longitudinal dispersion coefficients | Particle swarm optimization-based evolutionary algorithm | [38] |
39 | The adaptive learning network used was a neuro-fuzzy-based group method of data handling (NF-GMDH) | With the central aim being the prediction of flow discharge, focus was on straight compound channels | The accuracy of prediction associated with NF-GMDH–GSA network was found to be superior to the case of the NF-GMDH–PSO network | Gravitational search algorithm (GSA) and particle swarm optimization (PSO) | [39] |
40 | Large eddy simulation (LES), Reynolds stress model (RSM), and k-epsilon renormalization group (RNG) | Water surface treatments [volume of fluid (VOF), porosity, and rigid lid] | The software embracing VOF and RSM depicted the best results regarding representation of counter-rotating secondary flow cells | Domain representations in terms of body-fitted coordinate (BFC) and Cartesian grids | [40] |
41 | Coupled computational fluid dynamics (CFD) and artificial neural networks (ANNs) | Plate-fin-tube heat exchangers. Study parameters included the efficiency index j/f as a function of Reynolds number (Re), the Fanning friction factor f0, and the Colburn j-factor | The main aim was to predict flow characteristics in the targeted heat exchangers. Results suggested that coupled CFD–ANNs are robust and effective for predicting the thermal–hydraulic performance of plate-fin-tube heat exchangers | COMSOL Multiphysics software | [41] |
42 | The authors compared a CPU-based CFD solver and a GPU-based CFD solver | 2D and 3D nonuniform steady laminar flow in a domain, modeled with convolutional neural networks | The CNN estimates velocity fields four and two orders of magnitude faster than CPU-based and GPU-accelerated CFD solvers, respectively, while keeping the error low | Convolutional neural network (CNN)-based surrogate model | [42] |
URL:
https://www.sciencedirect.com/science/article/pii/B9780128186992000093
Source: https://www.sciencedirect.com/topics/computer-science/computer-aided-software-engineering