Friday, February 10, 2012

Evaluating Data Flow Diagram Quality


The previous assignments in our subject were related to diagrams, particularly the activity diagram and the different types of data flow diagram. In assignments 8 and 9, we were tasked to design an activity diagram and a data flow diagram for the pre-enrollment system of USEP. Here in assignment 10, we are tasked to discuss the characteristics an analyst examines when evaluating Data Flow Diagram quality. Before I discuss those characteristics based on assignments 8 and 9, let us first understand what a DFD is and why it is important in the preliminary phase of a project. 


Understanding Data Flow Diagram… 

A Data Flow Diagram (DFD) is a graphical representation that shows the processes, the flow of data, and the external entities that work within a system; it models how the system transforms information from input to output. As the name implies, a data flow diagram consists of different symbols, each with a specific use in tracing the flow of data. The most commonly used symbols are the arrows, which represent data flows; the rectangular boxes, which represent external entities, the sources or destinations of data; and the rounded squares, which represent the processes that transform input into output.

Some points need to be considered when dealing with data flow diagrams: where the input information comes from and where it goes after being processed; what happens to the data as it enters the system and what its possible outcome is; and where the output goes after the process. There are different levels of data flow diagram: the Level 0 diagram, or context diagram, and the Level 1 data flow diagram. The context diagram shows the direct flow of data from its sources to its destinations, depicting the whole system as a single process surrounded by its different sources of data. The Level 1 diagram shows the system's preliminary processes, data stores, sources, and destinations of data, linked by data flows. 
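To make the symbols concrete, here is a minimal sketch of how a DFD's elements could be held as plain Python data structures. The names used ("Pre-enrollment System", "Student", "Registrar", the flow labels) are hypothetical examples for illustration, not part of any actual USEP diagram.

```python
# Minimal sketch of DFD elements as Python data structures.
# All names here are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class DFD:
    processes: set = field(default_factory=set)   # rounded squares
    entities: set = field(default_factory=set)    # rectangular boxes (sources/destinations)
    flows: list = field(default_factory=list)     # arrows: (source, label, destination)

    def add_flow(self, source, label, destination):
        self.flows.append((source, label, destination))

# A tiny context (Level 0) diagram: one process, external entities around it.
context = DFD(processes={"Pre-enrollment System"},
              entities={"Student", "Registrar"})
context.add_flow("Student", "enrollment form", "Pre-enrollment System")
context.add_flow("Pre-enrollment System", "assessment slip", "Student")
```

A Level 1 diagram would reuse the same structure, but with several processes in `processes` and flows linking them.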

Even though data flow diagrams are useful to analysts when planning and building a system or handling a project, there are still pros and cons in dealing with DFDs. 

Pros… 

* DFDs give further understanding of the underlying system and its sub-systems. 

* They give an over-all summary of the system. 

* They serve as one of the blueprints of the project, making them an important document. 

* They serve as a guide to the other members of the team, since developing the system itself is not part of an analyst's work. 

* Since a DFD is a graphical representation of data flow, it makes it easier for the team to trace errors when handling the system. 

Cons… 

* When a client desires a complicated system, the first problem is designing an equally complicated data flow diagram to ensure the proper flow of data in the system. 

* Some data flow diagrams may not be followed thoroughly, because in the actual design of the system they can turn out to describe something more complex than expected. 

* DFDs that are not designed clearly can confuse the rest of the team in understanding the flow of the system. 



Data Flow Diagrams: Systems Analysts’ helpful tool 

The complexity of a system can be visualized with the use of data flow diagrams, which makes them a very helpful tool for systems analysts, especially in the preliminary phase of planning a system. For me, data flow diagrams can become very complicated in terms of the number of sub-systems included, the external entities related to the system, and the flow of each piece of data from source to destination. On the helpful side, the analyst can visualize the flow of data through the system and anticipate the problems that may arise as that flow is analyzed. The system would be at risk if the systems analyst did not work out the data flow diagram first before going on to design the system, so every system an analyst plans must have DFDs to support it. 



On Evaluating Data Flow Diagram quality… 

Now, on to the main purpose of this assignment: what characteristics does an analyst examine when evaluating data flow diagram quality? To answer this question, here are some of the things I did and watched out for so that my data flow diagram would (hopefully) be understood by others. 

Data Flow Diagrams must be readable. According to our interview with Sir Rex Bugcalao, the work of a systems analyst is to design a system that fits what the client wants, and that design work includes creating a data flow diagram of the system. Data flow diagrams are essential in designing a system since they are the basis for its flow. To serve that purpose, a DFD should be readable enough that even someone who is not an analyst or a member of the team can understand the data flow of the system; that way there is no hassle for the other team members or for the future developers of the system. 

Minimal complexity in DFDs. Most complex systems come from a complex plan on the data flow diagram, since the diagram is the basis for designing the system. Making a data flow diagram takes time and understanding, because it must cover all the possibilities the system may execute and perform. Too much information, and too many possibilities considered at once, becomes a burden for the team and can lead to misunderstanding. To avoid this, the analyst should break the diagram into levels of DFDs, deliver each diagram clearly, and prevent misunderstandings at the start of the preliminary phase. 

Consistency of the DFDs. The DFD is an essential tool in creating the system the client needs, and any inconsistency in the DFD can result in a dilemma while developing the system. The first problem occurs among the members of the team developing the system. The plan starts from the systems analyst, and it includes making the data flow diagram and the other diagrams that depict the system. Before the members start their work on the system, they first look at the DFDs made by the analyst to learn the flow of the system, and they should be critical enough in understanding that flow since they are the ones who will develop it. If one of the entities does not correspond to the DFD as designed, it creates an inconsistency between the planned system and the actual system. The second problem may occur after the development of the system, when the client changes some details of the expected system: the designed DFD no longer matches what the client expects. This costs the systems analyst time, since everything was already set when the changes to the plan arrived. 

Inconsistencies inside a DFD can often be identified by merely looking at the structure of the diagram. Some of the common consistency errors are pointed out by Satzinger and others: 

* Differences in data flow content between a process and its decomposition. For example, an enrollment data flow that appears on a level-1 diagram but is not seen on the level-0 diagram creates a consistency violation in the DFD. 

* Input data that does not correspond to the output data, which leads to inconsistency in delivering data inside the system. 

* Output data that cannot be produced from the given input data, which is also considered an inconsistency in the flow of data in the system. 
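The first error above, a flow that appears at one level but not the other, can be checked mechanically. Here is a hedged sketch of such a "balancing" check, assuming flows are stored as `(source, label, destination)` tuples; the flow labels and process names are hypothetical examples.

```python
# Sketch of a level-0 vs. level-1 balancing check.
# A data flow entering or leaving the single level-0 process should also
# appear somewhere in the level-1 decomposition. Names are hypothetical.
def unbalanced_flows(level0_flows, level1_flows):
    """Return level-0 flow labels that are missing from the level-1 diagram."""
    level1_labels = {label for (_, label, _) in level1_flows}
    return [label for (_, label, _) in level0_flows
            if label not in level1_labels]

level0 = [("Student", "enrollment data", "System"),
          ("System", "assessment slip", "Student")]
level1 = [("Student", "enrollment data", "Verify Records"),
          ("Verify Records", "verified data", "Assess Fees")]

print(unbalanced_flows(level0, level1))  # → ['assessment slip']
```

Here the "assessment slip" flow exists on the context diagram but was never drawn on level 1, which is exactly the kind of violation described above.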


Understanding black holes, grey holes, and miracles. There are several diagramming mistakes that can occur when creating a DFD. These mistakes should be critically examined by the analyst and considered early, to ensure the designed DFD is capable of delivering the system the client expects. Black holes and miracles are common mistakes that can be spotted easily. A black hole is a situation where a processing step has an input flow but no matching output flow; for me, it is called a black hole because the input flows into a process whose outcome is unidentified. A miracle is the opposite situation, where a processing step produces an output flow but has no corresponding input flow. As with the black hole, the name fits: there is no specified input flow yet an output flow of data arises, which raises the question, where could this output have come from without a corresponding input? Last is the grey hole, a situation where a processing step has both inputs and outputs, but the inputs are insufficient to produce the outputs; for example, the output contains more information than the sum of its inputs. 
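Black holes and miracles are purely structural, so a simple audit over the flows can flag them. This is a sketch under the assumption that flows are `(source, destination)` pairs; the process names are invented for the example. Grey holes are not flagged here, since judging whether inputs are sufficient for an output requires semantic knowledge of the data, not just the diagram's structure.

```python
# Sketch: flagging black holes and miracles among a DFD's processes.
# Flows are (source, destination) pairs; process names are hypothetical.
def audit_processes(processes, flows):
    """Classify each process that lacks an input or an output flow."""
    problems = {}
    for p in processes:
        has_input = any(dst == p for (_, dst) in flows)
        has_output = any(src == p for (src, _) in flows)
        if has_input and not has_output:
            problems[p] = "black hole"   # input flow, no matching output
        elif has_output and not has_input:
            problems[p] = "miracle"      # output flow, no corresponding input
    return problems

flows = [("Student", "Validate Form"),    # Validate Form receives input only
         ("Print Schedule", "Student")]   # Print Schedule produces output only
print(audit_processes({"Validate Form", "Print Schedule"}, flows))
```

Running this flags "Validate Form" as a black hole and "Print Schedule" as a miracle, matching the definitions above.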





Source: 
http://www.hit.ac.il/staff/leonidm/information-systems/ch24.html 
http://it.toolbox.com/blogs/enterprise-solutions/data-flow-diagrams-dfds-14573 
http://faculty.babson.edu/dewire/Readings/dfdmistk.htm
