Friday, February 24, 2012

Example of an Organization Installing an ERP Package


Enterprise Resource Planning (ERP) System

Enterprise Resource Planning (ERP) is an information system that integrates all data and processes of an organization into a single, centralized system. It covers a broad set of activities that help integrate the functions running in an organization, such as marketing, manufacturing, supply chain management, human resources, and customer relationship management.
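As a toy illustration of that single, centralized system, the sketch below (written in Python purely for illustration) shows two departmental functions writing to and reading from one shared record store instead of keeping separate copies; the module names, fields, and data are invented for this example and do not describe any particular ERP product.

    # Toy illustration: two business functions share one central record store
    # instead of each keeping its own copy of the data. All names are invented.
    CENTRAL_DB = {}  # order_id -> order record visible to every module

    def sales_take_order(order_id, item, qty):
        """The sales function creates the order once, in the shared store."""
        CENTRAL_DB[order_id] = {"item": item, "qty": qty, "status": "received"}

    def manufacturing_fulfil_order(order_id):
        """The manufacturing function reads the same record and updates its status."""
        order = CENTRAL_DB[order_id]
        order["status"] = "in production"
        return order

    sales_take_order("SO-001", "ointment 30g", 500)
    print(manufacturing_fulfil_order("SO-001"))  # both functions see one record

Because both functions touch the same record, a change made by one department is immediately visible to the other, which is the basic point of integrating data into a single system.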

Taro Pharmaceuticals (Israel)
About Taro


Taro was established in Haifa in 1950. Since then, Taro has produced many products sold in Israel and abroad and thus serves as a source of pride for the pharmaceutical industry in Israel. (http://www.taro.co.il/)

The Challenge – Efficiency and Innovation

Chief Information Officers in the pharmaceutical industry face business models that are straining to operate at scale. Pharma companies are therefore asking their IT experts to do two things at once: dramatically improve the efficiency of IT, and use IT to drive business innovation.

The Solution – Enterprise Resource Planning

Implementing an ERP solution potentially increases the effectiveness and efficiency of the pharmaceutical firm's processes and streamlines internal operations.

Business Case for ERP Implementation

Implementing an ERP has pros and cons. Many companies find ERP systems extremely useful and a help in thorough decision making. Others discover that having an ERP system only multiplies the cost and schedule delays of their business transactions.

Implementation Phases

The implementation of ERP in a pharmaceutical company follows a generic approach. This approach has six phases:

§ Planning
§ Requirements analysis
§ Design
§ Detailed design
§ Implementation
§ Maintenance

Planning

The planning phase includes the needs assessment and the business justification. The business justification for ERP covers both tangible and intangible benefits, such as inventory reduction and operating cost reduction.

Requirements Analysis

Requirements analysis includes:
§ Analyzing business processes
§ Specifying the processes to be supported by the ERP.

Design 

The fundamental decision in ERP system design is re-engineering versus customization. In the re-engineering approach, the team selects a commercial off-the-shelf ERP package and re-engineers the business processes to fit the package. In the customization approach, the team selects a commercial ERP package and customizes it to meet the organization's unique requirements.

Detailed Design 

"Best practices" methodology provides models supporting the business processes of each functional area. The process for using the best practices involves the following steps:

§ Select applicable business processes
§ Discard inapplicable processes
§ Use the processes that do not match the system as the foundation for re-engineering
§ Identify any areas not covered by the best practices as candidates for customization

Detailed design also involves interactive prototyping and extensive user involvement in determining system design elements.

Implementation

ERP implementation covers issues such as configuration, data migration, building interfaces, implementing reports, pilot testing, going live, and training. Many companies contract a technical support specialist from the software supplier to assist with implementation. Configuration issues such as data ownership, the distribution of procedures and transactions, and data management should also be taken care of. Common implementation strategies are the big bang approach, the phased approach, and a mixed approach.
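To make the data-migration and pilot-testing steps a little more concrete, here is a minimal Python sketch of a validation pass that could run before go-live; the record sets, field names, and sample size are assumptions for the example and are not taken from Taro's or any vendor's actual implementation.

    import random

    # Hypothetical record sets: rows exported from the legacy system and rows
    # loaded into the new ERP's staging area. Field names are invented.
    legacy_rows = [
        {"id": 1, "name": "Clinic A", "balance": 1200.0},
        {"id": 2, "name": "Clinic B", "balance": 350.5},
        {"id": 3, "name": "Clinic C", "balance": 0.0},
    ]
    migrated_rows = [
        {"id": 1, "name": "Clinic A", "balance": 1200.0},
        {"id": 2, "name": "Clinic B", "balance": 350.5},
        {"id": 3, "name": "Clinic C", "balance": 0.0},
    ]

    def validate_migration(source, target, sample_size=2):
        """Compare record counts, then spot-check a random sample field by field."""
        if len(source) != len(target):
            return f"Count mismatch: {len(source)} source vs {len(target)} migrated"
        target_by_id = {row["id"]: row for row in target}
        for row in random.sample(source, min(sample_size, len(source))):
            if target_by_id.get(row["id"]) != row:
                return f"Row mismatch for id {row['id']}"
        return "Sample validation passed"

    print(validate_migration(legacy_rows, migrated_rows))

In a phased approach a check like this would be repeated for each module or site as it is cut over; in a big bang approach it would run once, over everything, before going live.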

SAP
Business Scenario Groups


§ Time to Market
Research and Development in the Life Sciences industry includes all processes of medicine discovery, scientific development, and preclinical and clinical evaluation. SAP's solutions enable customers to optimize investments by assessing budgets, costs, and risks, as well as by forecasting and planning resource capacity. Customers can realize increased efficiency and accelerated development times by capturing critical information across development (e.g. analytical testing, product and packaging specifications), optimizing the scale-up process, and reducing the clinical trial supply cycle time and regulatory submission compilation time. Productivity can be enhanced by efficient collaboration within the company and with business partners.

§ Product Quality
As regulatory agencies continue their close global vigilance over manufacturing practices in the Life Sciences industry to improve drug quality and safety, companies are also under increasing cost pressures that force them to manufacture their products more efficiently. A key differentiator is the ability to enforce compliance throughout the manufacturing process. Achieving this goal requires the ability to improve product quality by decreasing manufacturing variability, thereby reducing the risk of non-compliance. An increase in production efficiency can be realized by integrating key information across the organization, including capturing quality and equipment information in one system of record that can be easily maintained. This single view of manufacturing data and processes leads to a consistent and compliant batch record.

§ Time to Value
Pharmaceutical companies need to decrease the time from drug launch to peak sales to maximize product revenues, not only by increasing their sales effectiveness but also by optimizing their supply chain planning. Successful companies adopt a customer-centric approach in order to integrate new multi-channel marketing capabilities with the classic detailing approach. This allows for better segmentation of physicians through analytical insights and ensures compliance for drug sampling. Flexible and responsive supply chain planning is critical to sustaining revenues in an increasingly competitive market. Additional revenue can be gained by effectively managing chargeback and rebate volumes through tight control of contract and pricing guidelines, helping to reduce overpayment and thus improving profitability.

§ Product Safety
Regulatory agencies in the Life Sciences industry are vigilant in assuring the safety of medical products at all stages in the product lifecycle - manufacturing, distribution and consumption or usage. Those companies that excel at proactively managing product safety continuously improve their product quality. The solutions help customers maintain one global system with consistent information regarding all complaints; gain visibility on the progress of investigations and increase speed of communication and issue resolution. Reporting and trend analysis on a global level allows companies to anticipate potential product safety issues by utilizing early warning signals to support corrective actions. The goal of drug tracking and tracing is to prevent unsafe product from entering the supply chain by establishing a secure electronic pedigree so that every unit of medication is authenticated.

ERP Implementation compared to Standard SDLC

The company's ERP implementation follows the generic approach, which is similar to the standard SDLC: it has a planning phase, a requirements analysis phase, a design phase, an implementation phase, and a maintenance phase. One additional phase is present in the company's implementation, namely the detailed design phase.




Source:
http://www.nickmutt.com/what-is-erp.htm
http://www.taro.co.il/
http://knol.google.com/k/erp-in-pharmaceutical industry#6(2E)0_Product_profiling_(28)mysap_suite(29)

Sunday, February 19, 2012

Characteristics in Defining Deployment Environment

Considering that I was tasked by the IC dean to evaluate the enrollment system of the university, there are several things that need to be considered for the enrollment system to work inside the university. One of the primary considerations after developing the enrollment system is the deployment environment of the application. The most common things to consider are the compatibility of the developed system with the system requirements, the hardware and software that will be used, network planning, the security configuration, the cost and schedule of the deployment, and the deployment itself. Before I evaluate those things, let me first define what the deployment phase is and why it is important after the system has been developed.


By definition, system deployment is the stage in which the delivery, installation, and testing of a computer system occur to bring it to a state of operational readiness. In addition, system deployment is the process of transforming a mechanical, electrical, or computer system from a packaged form to an operational state. In other words, software deployment is the process of showing the outcome of what the IT team has been doing for many days: once a system has been developed, deployment comes next.

The system deployment phase is one of the critical phases in developing a system. It is where the developed system is passed to the clients and undergoes testing and checking for errors and bugs. Deploying a system does not simply mean that you, the IT team, pass the system to the clients and no longer care what happens next. Deployment needs critical analysis of how to deploy the system, what things to consider in deploying it, the requirements for deploying it, and especially the cost and time needed to deploy the system properly.


According to the Microsoft Developer Network (MSDN), when choosing a deployment strategy, the person tasked with the system must do the following:
*Understand the target physical environment for deployment,
*Understand the architectural and design constraints based on the deployment environment, and
*Understand the security and performance impacts of your deployment environment.
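To make those three points concrete, here is a minimal Python sketch of how the target environment could be recorded and reviewed before deployment; the field names, the memory threshold, and the sample values are assumptions for illustration and are not taken from MSDN.

    from dataclasses import dataclass

    # Hypothetical environment descriptor covering the three points above:
    # the physical target, the constraints it imposes, and its security posture.
    @dataclass
    class DeploymentEnvironment:
        os_name: str         # e.g. "Windows Server"
        ram_gb: int          # physical memory of the target machine
        network: str         # "wired" or "wireless"
        tls_available: bool  # can traffic be encrypted in transit?

    def review_environment(env, required_ram_gb=4):
        """Return warnings about design or security constraints of the target."""
        warnings = []
        if env.ram_gb < required_ram_gb:
            warnings.append("Target machine may not meet the memory requirement.")
        if env.network == "wireless" and not env.tls_available:
            warnings.append("Confidential data would cross the wireless network unencrypted.")
        return warnings

    # Example: a lab workstation used as an enrollment terminal (assumed values).
    print(review_environment(DeploymentEnvironment("Windows 7", 2, "wireless", False)))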

Based on the information I gathered about the system deployment environment, I can now enumerate the characteristics examined when choosing or defining a deployment environment. Here are some of the characteristics that need to be examined:

Compatibility among System Requirements. The systems analyst must consider the system requirements when deploying a system. In an enrollment system, accessing and updating information are the most common operations, so the speed of access and update must be considered so that there is no delay in retrieving information. A compatible database must also be included in the considerations because student information is stored in a database.

Compatibility among Hardware and System Software. Hardware and software are included in the considerations since the system software being deployed must match the hardware of the computer. In relation to the enrollment system in USEP, most of the computers run a Microsoft operating system, which means the system should be well suited to computers using Intel processors. Considering the compatibility between hardware and software can improve the performance of the system and minimize the cost of maintenance, since it ensures there will be no compatibility issues between the two.
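As a small illustration of that check, the Python sketch below inspects the operating system and processor architecture of the machine it runs on before installation proceeds; the supported values listed here are assumptions for the example, not USEP's actual specifications.

    import platform

    # Assumed minimum platform for the hypothetical enrollment client.
    SUPPORTED_OS = {"Windows"}
    SUPPORTED_ARCH = {"AMD64", "x86_64"}

    def is_compatible():
        """Check the local machine's operating system and processor architecture."""
        os_ok = platform.system() in SUPPORTED_OS
        arch_ok = platform.machine() in SUPPORTED_ARCH
        if not os_ok:
            print(f"Unsupported operating system: {platform.system()}")
        if not arch_ok:
            print(f"Unsupported architecture: {platform.machine()}")
        return os_ok and arch_ok

    if __name__ == "__main__":
        print("Compatible" if is_compatible() else "Not compatible")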

Consideration on Network Planning. In an enrollment system, one of the primary reasons for having a system is to interconnect different departments so that there is unity in the information being processed. The systems analyst must therefore consider the connectivity methods inside the university. If the university goes with wireless networks, the vulnerabilities of sending crucial details over a wireless network must be considered, such as students' grades and basic personal information, which are confidential.

Security Configuration. The systems analyst must consider the security of the system. Access to the system must be limited to the people handling the information. For the enrollment system, the system must be deployed only in the departments involved in enrollment so that their staff are the only ones who can access the records.
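A minimal sketch of that kind of restriction is shown below; the role names, the permitted roles, and the record fields are assumptions made for illustration, not the university's actual access policy.

    # Hypothetical role check: only registrar and cashier staff may read
    # enrollment records in this example; the role names are assumed.
    ALLOWED_ROLES = {"registrar", "cashier"}

    def can_view_record(user_role):
        return user_role.lower() in ALLOWED_ROLES

    def fetch_enrollment_record(user_role, student_id):
        if not can_view_record(user_role):
            raise PermissionError(f"Role '{user_role}' may not access enrollment records.")
        # Placeholder for the actual database lookup.
        return {"student_id": student_id, "status": "enrolled"}

    print(fetch_enrollment_record("registrar", "2012-00123"))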

Cost and Schedule for Deployment. One of the most salient things to consider in deploying a system is the cost and schedule of the deployment. As the analyst tasked to deploy a system in the university, I must consider the university's budget for the system. Cost and time are closely related: when the cost of deployment runs high, the schedule for deploying and maintaining the system can be compromised. As an analyst, you should weigh both the cost and the maintenance of the deployment against the needs of the system.


Sources:
http://books.google.com.ph/books?id=-ot62DeCKO4C&pg=PA309&lpg=PA309&dq=characteristics+that+an+anlayst+examines+when+choosing+or+defining+deployment+environment&source=bl&ots=V0x_RrJEXy&sig=ps0TRNmjo8XOHg7SB59vcqAN7Ng&hl=tl&ei=VTSOS9S2O4SQtgPk4qnZCA&sa=X&oi=book_result&ct=result&resnum=2&ved=0CA0Q6AEwAQ#v=onepage&q&f=false

http://msdn.microsoft.com/en-us/library/ee658120.aspx

Friday, February 10, 2012

Evaluating a Data Flow Diagram Quality


The previous assignments we had in our subject were related to diagrams, particularly the activity diagram and the different types of data flow diagrams. In assignments 8 and 9, we were tasked to design an activity diagram and data flow diagrams for the pre-enrollment system of USEP. Here in assignment 10, we are tasked to discuss the characteristics an analyst examines when evaluating data flow diagram quality. Before I discuss those characteristics in light of assignments 8 and 9, let us first understand what a DFD is and why it is important in the preliminary phase of a project.


Understanding Data Flow Diagram… 

A Data Flow Diagram (DFD) is a graphical representation that shows the processes, the flow of data, and the external entities that work within a system; it shows how information is transformed from input to output. As the name suggests, a data flow diagram consists of different symbols, each with a specific use in identifying the flow of data. The most commonly used symbols are the arrows, which represent data flows; the rectangular boxes, which represent the external entities that are the sources or destinations of data; and the rounded squares, which represent the processes that transform data from input to output. Several points need to be considered when dealing with data flow diagrams: where the input information comes from and where it goes after being processed; what happens to the data as it enters the system and what the possible outcome is; and where the output goes after the process. There are different levels of data flow diagram: the level-0 diagram, or context diagram, and the level-1 data flow diagram. The level-0 (context) diagram shows the direct flow of data from its sources to its destinations and depicts the system as a single process with different sources of data. The level-1 data flow diagram shows the system's main processes, the data stores, and the sources and destinations of data linked by data flows.
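To make the notation concrete, here is a minimal Python sketch of how a small fragment of a pre-enrollment DFD could be written down as data; the process, entity, and flow names are invented for illustration and are not the actual USEP diagram.

    # Hypothetical fragment of a level-1 pre-enrollment DFD, recorded as data.
    # Each flow is (source, label, destination); all names are invented.
    EXTERNAL_ENTITIES = {"Student", "Registrar"}
    PROCESSES = {"1.0 Verify Requirements", "2.0 Encode Subjects"}
    DATA_STORES = {"D1 Student Records"}

    FLOWS = [
        ("Student", "enrollment form", "1.0 Verify Requirements"),
        ("1.0 Verify Requirements", "verified form", "2.0 Encode Subjects"),
        ("2.0 Encode Subjects", "enrollment record", "D1 Student Records"),
        ("D1 Student Records", "enrollment summary", "Registrar"),
    ]

    def flows_of(node):
        """List the labelled flows entering and leaving one node of the diagram."""
        incoming = [f for f in FLOWS if f[2] == node]
        outgoing = [f for f in FLOWS if f[0] == node]
        return incoming, outgoing

    print(flows_of("1.0 Verify Requirements"))

Writing the diagram down this way is only a convenience for the quality checks sketched later in this post; the drawn diagram remains the primary artifact.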

Even though data flow diagrams are useful to analysts when it comes to planning and making a system or handling a project, there are still pros and cons in dealing with DFDs.

Pros… 

* DFDs give further understanding on the underlying system and sub-systems. 

* It basically gives an overall summary of the system.

* It serves as one of the blueprints of the project, thus making it an important file. 

* It serves as a guide to the other members of the team, since developing the system is not part of the analyst's work.

* Since DFD is a graphical representation of data flow, it would be easy for the team to trace errors when handling the system. 

Cons… 

* When a client desires a complicated system, the first problem is that the data flow diagram must also become complicated in order to ensure the proper flow of data in the system.

* Some data flow diagrams may not be followed thoroughly because, in the actual design of the system, they can come across as describing an overly complex system.

* DFDs that are not designed clearly can confuse the rest of the team in understanding the flow of the system.



Data Flow Diagrams: Systems Analysts’ helpful tool 

The complexity of a system can be visualized with data flow diagrams. They are a very helpful tool for systems analysts, especially in the preliminary phase of planning a system. For me, data flow diagrams can become very complicated in terms of the number of subsystems included, the external entities related to the system, and the flow of each piece of data from source to destination. But on the helpful side, the analyst can visualize the flow of data in a system and anticipate problems that may arise as the flow of the system is analyzed. The system would be at risk if the systems analyst did not work through the data flow diagram before designing the system. Every system an analyst plans must have DFDs to support it.



On Evaluating Data Flow Diagram quality… 

Now, on to the main purpose of this assignment: what characteristics does an analyst examine when evaluating data flow diagram quality? To answer this question, here are some of the things I did and anticipated so that my data flow diagram would (hopefully) be understood by others.

A Data Flow Diagram must be readable. According to our interview with Sir Rex Bugcalao, the work of a systems analyst is to design a system that fits what the client wants, and that design work includes creating a data flow diagram of the system. Data flow diagrams are essential in designing a system since they are the basis for the flow of the system. To serve their principal use, DFDs should be readable enough that the data flow of the system can be understood even by someone who is not an analyst or part of the team, so that there is no hassle for the other team members or for the future developers of the system.

Minimal Complexity in DFDs. Most complex systems come from a complex plan in the data flow diagram, since the DFD is the basis for designing the system. Making a data flow diagram takes time and understanding in order to include all the possibilities the system may execute and perform. Too much information and too many possibilities become a burden for the team because they can lead to misunderstanding, since a lot of information must be dealt with. To prevent this, the analyst should make use of the levels of DFDs, deliver the diagram clearly, and avoid misunderstandings at the start of the preliminary phase.

Consistency of the DFDs. The DFD is an essential tool in creating the system the client needs, and any inconsistency in it can create a dilemma in developing the system. The first problem occurs among the members of the team developing the system. The plan starts with the systems analyst, who makes the data flow diagram and the other diagrams that depict the system. Before the members start their work, they first look at the DFDs made by the analyst to learn the flow of the system, and they must be critical enough in understanding that flow since they are the ones who will develop it. If one of the entities does not correspond to the DFD as designed, it creates an inconsistency between the planned system and the actual system. The second problem occurs when, after the development of the system, the client changes some of the details of the expected system; the designed DFD then no longer matches what the client expects. This costs the systems analyst time, since everything was already set when the plan changed.

Inconsistencies inside the DFD can often be identified by merely looking at the structure of the diagram. Some of the common consistency errors are pointed out by Satzinger and others.

* A mismatch in data flow content between a process and its decomposition. For example, an enrollment data flow that appears on the level-1 diagram but is not seen on the level-0 diagram creates a consistency (balancing) violation in the DFD; a simple check for this is sketched after this list.

* Input data that does not correspond to the output data can lead to inconsistency in how data is delivered inside the system.

* Output data that cannot be produced from the given input data is also considered an inconsistency in the flow of data in the system.
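As promised above, here is a minimal sketch of the balancing check in the first bullet, assuming both diagram levels are recorded as sets of flow labels in the style of the earlier fragment; the labels themselves are invented for illustration.

    # Hypothetical balancing check: every flow crossing the boundary of process 1.0
    # on the level-0 diagram must also appear on its level-1 decomposition.
    LEVEL0_BOUNDARY_FLOWS = {"enrollment form", "enrollment summary"}
    LEVEL1_FLOWS = {"enrollment form", "verified form", "enrollment record"}

    def unbalanced_flows(parent_flows, child_flows):
        """Return boundary flows of the parent process missing from its decomposition."""
        return parent_flows - child_flows

    missing = unbalanced_flows(LEVEL0_BOUNDARY_FLOWS, LEVEL1_FLOWS)
    if missing:
        print("Balancing violation, missing flows:", missing)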


Understanding Black Holes, Grey Holes, and Miracles. There are several diagramming mistakes that can occur when creating a DFD. These mistakes should be examined critically by the analyst to ensure that the designed DFD is capable of delivering the system the client expects. Black holes and miracles are common mistakes that can be spotted easily. A black hole is a situation in which a processing step has input flows but no matching output flow; for me, it is called a black hole because the input flows lead to an unidentified outcome. A miracle is a situation in which a processing step has an output flow but no corresponding input flow; as with the black hole, it is called a miracle because an output appears without any specified input, which raises the question of where that output could have come from. Lastly, a grey hole is a situation in which the outputs of a processing step do not correspond to its inputs; the output may be greater than what the sum of its inputs could produce.
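Using the same flow-list idea as before, the sketch below flags processes that have inputs but no outputs (black holes) and processes that have outputs but no inputs (miracles); the process names and flows are invented for illustration, and grey holes are left to human judgement since they concern the content of the flows rather than their mere presence.

    # Hypothetical flows of a small diagram: (source, label, destination).
    FLOWS = [
        ("Student", "enrollment form", "1.0 Verify Requirements"),  # 1.0 never outputs
        ("2.0 Encode Subjects", "class list", "Registrar"),         # 2.0 has no input
    ]
    PROCESSES = {"1.0 Verify Requirements", "2.0 Encode Subjects"}

    def diagramming_mistakes(flows, processes):
        """Report processes that are black holes (no outputs) or miracles (no inputs)."""
        inputs = {p: [] for p in processes}
        outputs = {p: [] for p in processes}
        for src, label, dst in flows:
            if dst in processes:
                inputs[dst].append(label)
            if src in processes:
                outputs[src].append(label)
        for p in sorted(processes):
            if inputs[p] and not outputs[p]:
                print(f"Black hole: {p} consumes {inputs[p]} but produces nothing.")
            if outputs[p] and not inputs[p]:
                print(f"Miracle: {p} produces {outputs[p]} from no input.")

    diagramming_mistakes(FLOWS, PROCESSES)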





Source: 
http://www.hit.ac.il/staff/leonidm/information-systems/ch24.html 
http://it.toolbox.com/blogs/enterprise-solutions/data-flow-diagrams-dfds-14573 
http://faculty.babson.edu/dewire/Readings/dfdmistk.htm