Risk Assessment of Cloud Computing

Below is a short qualitative risk assessment of cloud computing that may be used as guidance for any company considering a move to the cloud, showing some of the strengths, weaknesses, and benefits.

Risks

Risk | Description | Probability | Impact | Affects
Lock-in | Difficult to migrate from one service provider to another | High | Medium | Company reputation, data, service
Loss of Governance | Loss of some control to the CP; unclear roles | Very High | Very High | Company reputation, data, customer trust, service
Compliance Challenges | Compliance with regulations and certifications | Very High | High | Certifications, fines
Business Reputation Loss | Poor service harms the business during transition | Low | High | Company reputation, service, data
Cloud Service Terminated | Poor provider; lack of understandable terms | N/A | Very High | Reputation, trust, employee loyalty, service
Provider Acquisition | Mergers and buy-outs of the CP | N/A | Medium | Reputation, customer trust, employee experience, intellectual property, data, service
Supply Chain Failure | Lack of supplier redundancy | Low | Medium | Company reputation, customer trust, data, services
Technical Risk | Over- or under-provisioning | Medium | Medium | Access control, company reputation
Malicious Insider | Abuse of high privileges | Medium | Very High | Company reputation, data, employee and customer trust
Interception of Data in Transit | Weak encryption; vulnerabilities in the cloud | Medium | High | Company reputation, data, intellectual property
Insecure/Ineffective Data Deletion | Improper sanitization of data | Medium | Very High | Sensitive data, personal data
DDoS | Distributed denial-of-service attack | Medium | High | Cloud interface, network, customers, company reputation, service
Data Protection | Storage in multiple locations | High | High | Company reputation, data, service
Network Breaks | Outages not part of the CP | Medium | Medium | Service
Social Engineering | Lack of security awareness | Medium | High | Intellectual property, data, employee and customer trust, reputation
Natural Disasters | Lack of a recovery plan | Very Low | High | Back-ups, all of the above
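
Qualitative ratings like the ones above can be combined into a simple priority ranking. The sketch below is only an illustration: the ordinal scale values and the multiply-and-sort scoring rule are my own assumptions, not part of any formal methodology.

```python
# Minimal qualitative risk scoring: map Probability and Impact ratings
# to ordinal values and rank risks by their product.
LEVELS = {"Very Low": 1, "Low": 2, "Medium": 3, "High": 4, "Very High": 5}

def risk_score(probability: str, impact: str) -> int:
    """Return an ordinal risk score (1-25); higher means riskier."""
    return LEVELS[probability] * LEVELS[impact]

risks = [
    ("Lock-in", "High", "Medium"),
    ("Loss of Governance", "Very High", "Very High"),
    ("Malicious Insider", "Medium", "Very High"),
    ("Natural Disasters", "Very Low", "High"),
]

# Sort by descending score to prioritize mitigation effort.
for name, p, i in sorted(risks, key=lambda r: -risk_score(r[1], r[2])):
    print(f"{name}: {risk_score(p, i)}")
```

Scores like these only order risks relative to each other; they do not replace the judgment behind the original ratings.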

 

Strengths & Benefits:

  • Security measures are cheaper when implemented on a large scale.
  • Data is replicated in multiple areas, increasing redundancy and independence from failure.
  • Local network problems are less likely to have global side effects.
  • Larger-scale systems can develop more effective incident response capabilities.
  • Threat management improves, since the large corporations that run clouds can afford specialists for specific security threats that smaller companies cannot.
  • Reduces the cost of running in-house servers.
  • Provides access to better technology.

Weakness and Costs:

  • An external CP depends on network bandwidth.
  • Integrating a variety of software can be very costly.
  • Different configuration panels and controls mean a learning curve for the IT department.
  • Configuring mixed modes between physical, virtual, and cloud environments.
  • Performance reports could be hidden from the customer.
  • May not integrate with current management controls.

Reference:

Alex Gutman and Martin Perlin. (February 2011). 8 Cloud Building Conditions You Need for Taking Your Data Center to the Next Level. Evolven. Retrieved from http://www.evolven.com/blog/8-cloud-building-conditions-you-need-for-taking-your-data-center-to-the-next-level.html

Daniele Catteddu and Giles Hogben. (n.d.). Cloud Computing Risk Assessment. ENISA. Retrieved from http://www.enisa.europa.eu/act/rm/files/…/cloud-computing-risk-assessment

Naushad K. Cherrayil. (October 7, 2011). Cloud Computing Is the Future of Networking. Gulf News. Retrieved from http://gulfnews.com/business/technology/cloud-computing-is-the-future-of-networking-1.886905

Model-View-Controller

Introduction

MVC

Model-View-Controller (MVC) is a three-part architecture that decouples the interface from navigation and application behavior; keeping those concerns tangled together creates a huge mess when it is time to redesign your program. The MVC pattern simplifies implementation, greatly enhances reusability, and is widely used in object-oriented programming (OOP).

Model

The Model comprises the data model objects: it holds all the application state information (i.e., the data) and all the operations that can modify that data.

A model is a computational approximation or abstraction of a real-world process, entity, or system. An example is the shopping cart used when you order an item online through e-commerce. It holds the order number, what is being ordered, the quantity being ordered, and all the code that interacts with this data, usually SQL. The Model is also called the business logic; it provides the connection to the data source as well as to the Controller.
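
As an illustration, a shopping-cart Model might look like the following Python sketch. The class and method names are invented for this example, and a real implementation would back these operations with SQL rather than an in-memory dictionary:

```python
class CartModel:
    """Holds the cart state and every operation that may change it.
    Registered views are notified of each state change via events."""

    def __init__(self, order_number):
        self.order_number = order_number
        self.items = {}            # item name -> quantity
        self._observers = []       # views registered for change events

    def register(self, observer):
        self._observers.append(observer)

    def add_item(self, name, quantity=1):
        self.items[name] = self.items.get(name, 0) + quantity
        self._notify()

    def remove_item(self, name):
        if name in self.items:     # the Model may veto an invalid operation
            del self.items[name]
            self._notify()

    def _notify(self):
        # Push a change event to every registered View.
        for observer in self._observers:
            observer.model_changed(self)
```

Note that the class knows nothing about any View or Controller; it only pushes events to whoever registered, which is what keeps the Model independent.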

View

The View contains the interface functions; it is the GUI code. It produces all the visual components of your program and provides the user access to the data and processing logic.

It needs to give the user enough functionality to accomplish the tasks the program is being developed for.

The View is tied to the data model: if you delete an item from a shopping cart, it is removed immediately or upon a page refresh, usually by an event handler that receives the change and re-renders the cart as an HTML page without the deleted item.
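
As an illustration, a View that re-renders the cart as HTML whenever the Model reports a change could be sketched as follows. The sketch assumes a model object that calls `model_changed` on its registered observers; all names are illustrative:

```python
class CartView:
    """Renders the cart state; it reads the Model but never modifies it."""

    def __init__(self):
        self.html = ""

    def model_changed(self, model):
        # Re-render on every change event pushed by the Model.
        rows = "".join(f"<li>{name} x {qty}</li>"
                       for name, qty in model.items.items())
        self.html = f"<ul>{rows}</ul>"

# A minimal stand-in model to demonstrate the event round trip.
class FakeModel:
    def __init__(self):
        self.items = {"book": 1}

view = CartView()
view.model_changed(FakeModel())
print(view.html)  # <ul><li>book x 1</li></ul>
```

The View depends on the structure of the Model's state (it reads `model.items`) but never writes to it, matching the dependency rules described later.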

For an inexpensive home server, try a Linux server with the Apache web server, a MySQL database, and PHP. All of this software can be installed for free.

Controller

As its name implies, the controller component controls the overall flow. The controller code interacts with the view and model components to deliver a modular yet integrated solution.

It is the Controller that accepts input from the user in a particular modality, interprets that input (the interpretation may depend on the View), and invokes the appropriate operation on the Model.

For example, when the Controller detects a mouse-click event on the “remove” button of an item, it invokes the remove operation on that item. Any state changes this operation causes on the Model are sent by the Model to the registered Views via events. In a Java web application the Controller component is normally implemented as a Servlet.
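
Sketched in Python rather than as a Java Servlet, a Controller handling that “remove” click might look like the following. All names here are illustrative, and the tiny stand-in Model exists only to make the example runnable:

```python
class CartController:
    """Interprets user input and invokes the matching Model operation.
    It never touches a View's rendering directly: the Model's change
    events are what keep the Views synchronized."""

    def __init__(self, model):
        self.model = model

    def handle_click(self, button, item=None):
        if button == "remove":
            self.model.remove_item(item)  # the change flows through the Model
        elif button == "add":
            self.model.add_item(item)

class Model:
    """Minimal stand-in holding a set of item names."""
    def __init__(self):
        self.items = {"book"}
    def add_item(self, item):
        self.items.add(item)
    def remove_item(self, item):
        self.items.discard(item)

controller = CartController(Model())
controller.handle_click("remove", "book")
print(controller.model.items)  # set()
```

Notice the Controller only translates input into Model operations; it holds no rendering logic of its own.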

 MVC usage rules

In order to support reusability, the interactions which do occur should be well defined and the dependencies between the elements (M-V-C) should be minimized. One of the goals of the MVC pattern is to enable the combination of a single Model with multiple Views and Controllers. The MVC pattern ensures that the Views are kept synchronized. When the Controller recognizes a valid command from the user's input, it invokes the corresponding method on the Model. The Model verifies that the operation is compatible with its current state, executes it, and changes its state correspondingly. The Views, having registered themselves as observers, are then informed of the Model's state change and update their rendering accordingly.
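That synchronization flow can be sketched end to end: one Model, two registered Views, and one operation that a Controller would trigger. All names are illustrative:

```python
class Counter:
    """Model: one integer of state plus the operation that changes it."""
    def __init__(self):
        self.value = 0
        self.views = []

    def register(self, view):
        self.views.append(view)

    def increment(self):
        self.value += 1
        for view in self.views:       # push the change event to every View
            view.update(self.value)

class TextView:
    """View: renders the Model's state as a plain string."""
    def __init__(self):
        self.rendering = ""
    def update(self, value):
        self.rendering = f"count = {value}"

model = Counter()
a, b = TextView(), TextView()
model.register(a)
model.register(b)
model.increment()                      # a Controller would invoke this
print(a.rendering, "|", b.rendering)   # count = 1 | count = 1
```

Because both Views learn of the change from the same Model event, they can never drift out of sync with each other.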

 The dependencies must be kept minimal

To support multiple views and controllers the dependencies must be kept minimal.

 Note: A is said to be dependent on B when the code of A embeds knowledge about B.

This leads to the following rules:

  1.  The Model does not have any dependency on Views or Controllers.
  2.   A View depends on its associated Model. It has to know the structure of the Model’s state to be able to render it.
  3.   A View does not have a dependency on Controllers. Therefore several different Controllers can be associated with the same View.
  4.  A Controller depends on its associated Model and View. The Model defines the operations the Controller can invoke and the View defines the context in which the Controller interprets the user input. This makes the Controller tightly coupled to the View.

 The interactions must be kept minimal

Another precondition to support multiple Views and Controllers is to keep interactions minimal.

In particular, a Controller must never directly affect the rendering of its associated View. Instead, user input must make a complete round trip through the Model before its effects become visible in the View. This rule guarantees that a state change updates all Views and that the Views remain synchronized. Often, implementations with a single Controller violate this rule because of sloppy thinking: “I already know that this state change will occur, and therefore do not need to wait for the Model to tell me about it.” This is wrong for two reasons:

1. The Model can veto the operation for some reason. The operation will not occur.

2. Other Controllers may concurrently invoke operations on the Model. Some other operation can slip in between, which fundamentally changes the rendering and makes any assumptions about it invalid.

In addition it is impossible to extend such shortcut implementations later with additional Controllers.
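
The veto case is worth sketching: because the effect only reaches the View through the Model's event, a vetoed operation leaves every View untouched. The names below are illustrative:

```python
class Stock:
    """Model that vetoes removal when the item is not present."""
    def __init__(self):
        self.items = {"book"}
        self.views = []

    def remove(self, item):
        if item not in self.items:
            return False              # veto: no state change, no event
        self.items.discard(item)
        for view in self.views:       # event fires only on a real change
            view.update(self.items)
        return True

class View:
    def __init__(self):
        self.shown = {"book"}
    def update(self, items):
        self.shown = set(items)

model, view = Stock(), View()
model.views.append(view)
model.remove("pen")        # vetoed: "pen" was never in stock
print(view.shown)          # {'book'} - the View was never misled
```

Had the Controller updated the View directly after requesting the removal, the View would now be showing a state the Model never entered.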

 The MVC pattern in Web applications

Although the MVC pattern was originally devised for the organization of fat-client GUI libraries, it has in recent years received widespread acceptance as a suitable architectural pattern for implementing Web-based solutions, too. Its structure has been applied, with some limitations, in recent Web applications. This is not surprising, since in both cases the separation of concerns is the driving force behind architectural choices.

 Extending the MVC pattern to distributed applications

Although the MVC pattern was originally devised for GUIs running on a single machine, it can be extended relatively straightforwardly to distributed systems, where some interfaces between Model, View, and Controller may cross the network. The placement of the Model, the View, and the Controller then becomes a crucial issue.

The client-centric approach puts all three functions: Model, View and Controller on each client device. The Model, which exists conceptually only in one instance, is “replicated” among the devices. A replication protocol keeps the copies of the Model synchronized.

The server-centric approach puts both Controller and Model on a server. The client devices contain only the Views.

Why you should use MVC

  • Separation of data from presentation

Drawbacks of MVC

  • Not easy; requires planning
  • Requires thorough testing and more files
  • May be overkill for small applications

3 Reasons for BI

So you have been working with business intelligence tools for a while, but when the COO asks why you think they would be a good fit for your company, you cannot figure out how to explain it to him or her. Here I describe three arguments you may use to persuade a business of the general value that Business Intelligence offers to most companies.

 1. Make faster decisions

BI helps make better-quality, informed decisions at a faster rate than was possible in the past. It is not just for IT staff; it is used by managers, executives, and consumers. One of the key deliverables in BI is the dashboard, which allows instantaneous insight into enterprise, department, and individual performance by bringing key metrics into an attractive and intuitive graphic interface. The best part of a well-developed dashboard is the ability to drill down into the underlying reports and understand what factors are contributing to good and bad performance. Another basic feature of dashboards is that they let you effortlessly and constantly watch for exceptions and alert operators when to take action.

2. Report on the Now, Not the Past

While most reports can show you what has happened in the past, BI analytics can alert you to what is happening now and send out an alert. BI can also extrapolate possible future outcomes, all from a central location, so there is no relying on several different user reports from spreadsheets. Views are consistent across all users because of the automated data inputs, and because most of BI is automated, the accuracy of the data is also easier to trust. It is imperative for a corporation's success to have detailed analysis of its customers, business environment, stakeholders, business processes, competitors, and several other sources of potentially valuable information.

3. Future Insight

BI can offer future insight with predictive tools, so besides viewing past and present information, you can also get a feel for what may happen in the future. Forecasting possible outcomes gives users the ability to be proactive. Data mining allows analytics to run on information that may hold hidden patterns. Through simulations and by collecting seemingly unrelated data, information can be revealed about what may be approaching.

Conclusion

With BI you can increase employee productivity by empowering employees with up-to-date reports that improve business decision-making. Your business processes can be easily managed corporate-wide from one place. Relationships with business customers improve, as does the ability to increase market share; the company's IT department can reduce resources, which cuts costs and helps deliver a more flexible department for developing and deploying future cycles. The best approach, in my opinion, is to provide several case examples from different organizational implementations, from large scale to small, depending on which business you are trying to convince. If you are dealing with someone more tech-savvy, then instead of just dollars and cents, you can move on to actual business models that can help realize a business strategy.

Going Back to School

This post describes academic progression at college and the steps necessary to start, graduate, and move on to a better job.

 Price of College Increases Every Year

Before you can take any classes you really have to decide if you are in it for the long haul. If you can’t afford the high rates of tuition plus unexpected charges, you should look into applying for financial aid, scholarships, and loans.

Technology has increased severalfold since the '90s, and so has the way a college can educate its students. Because of the availability of alternative classes, students who already have careers and/or families are finding it easier to finish college.

In preparing this report, I spoke with financial counselors, academic counselors, and other students about what needs to be done to achieve your degree of choice as easily as possible. I already hold three associate degrees, in Networking, Programming, and Computer Applications, a bachelor's in Computer Science, and a master's in Information Systems Management. I worked full time through all of these degrees while raising four children.

What Do You Want to Be Doing 5-10 Years after College?

Before you start taking classes, you need a better understanding of what you like. Many people enter a field of study only to change it later on. There are several places, online and on college campuses, that will help guide you to the best career choice for you. Once you know what you want to do, you can use that career choice to decide what type of education and diploma is necessary to pursue your future goals.

The best place to start is always at the beginning. Before going back to college, you should know what you want to do after college. Don't chase dollar signs; it is best if you have a calling, a true desire to be in the field you choose, and then pursue a degree that supports this decision, whether you have past experience or not.

 Extended Learning

The nationwide availability of college programs gives busy working adults the opportunity to stay current in their chosen field and gain the additional education needed to advance in their career.

Available classes range from professional certificates to associate, bachelors, masters and doctoral degrees. Classes are offered in a variety of formats including evening and weekend classroom instruction, interactive television, mixed classroom instruction and online delivery as well as completely online instructions.

Where to Get Help

https://bigfuture.collegeboard.org/college-search

The above address offers a list of classes by city; selecting a city at or near your location links to a page that displays programs available in that area. Make sure to call the student service office to verify that the program you are interested in will still be available in the future. While pursuing a BS in Computer Information Systems (which is no longer available in my hometown), there was not high demand, the satellite classes were moved to the closest major city, and I ended up driving 50 miles one way to night classes 2-3 times each week to complete my degree (WORTH IT).

Before you register, you might want to know how close the nearest satellite campus is to where you want to take classes and what it offers.

College counselors are usually available at least once a semester to help plan a course schedule. Find out when and make sure to schedule an appointment in advance.

 Preparing Your Finances

There are numerous financial aid programs for students who need financial assistance. Programs are offered by federal, state, and institutional sources. Most people in middle-class or lower-income homes usually meet most requirements. Federal loans are also offered to supplement income or tuition while going to college.

I believe that you have to gamble a little in life, and I would rather bet on myself than on a two-dollar lottery ticket. That is why I applied for federal subsidized loans to help pay for expenses while I go to college. I believe I will be able to acquire a better-paying career faster with a BS or master's than without.

As of July 2007, the total cost of taking full-time classes each semester plus expenses is $8,893.00, and tuition is expected to rise another 7% each year. A lot of this expense can be minimized by living at home and finding books online. It is still a lot of money no matter who you are, but there are ways to work around some of the price that I will be presenting.
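
As a rough illustration of what that 7% figure means, you can compound it forward. For simplicity this sketch treats the whole per-semester amount as growing at 7%, even though the text applies the increase to tuition only; it is arithmetic, not a prediction:

```python
# Project per-semester cost assuming a 7% annual increase.
def projected_cost(base: float, annual_rate: float, years: int) -> float:
    """Compound the base cost forward by the annual rate."""
    return base * (1 + annual_rate) ** years

base = 8893.00  # July 2007 full-time semester cost, from the text
for year in range(1, 5):
    print(f"Year {year}: ${projected_cost(base, 0.07, year):,.2f}")
```

Even over four years, compounding makes the increase noticeably larger than a flat 7% of the starting amount, which is why planning finances early matters.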

 Applying for Financial Aid

College prices have been increasing at more than twice the rate of inflation for more than 20 years, and the ability of many students and their families to pay for higher education is becoming a national concern. Student financial services are becoming more critical to the college financing problem.

In order to be considered for financial aid, you must first be admitted to a college. Make sure that you have completed your tax return and have a copy of it nearby. Then go to www.fafsa.ed.gov to apply for financial aid. It is necessary to do this as soon as possible, since it could take at least 4 weeks to be approved and be contacted by the college of your choice. There are also several reference guides that deal with merit, scholarships, and other funding at most local and community college libraries.

Remember that you’ll need to fill out a new FAFSA before March 1st of each year. By meeting the priority deadline of March 1st, you can receive the maximum financial aid package you qualify for. If you don’t meet the priority deadline you may receive a lesser amount than you expect. Also if you want to check your financial aid status and see if there are any holds you can go online at most colleges and check.

Look for related links:

  1. student services
  2. student records
  3. student financial aid

 Student Loans

Another way to go is federal loans; depending on the type of loan you apply for, it could have little to no interest. Some don't even charge interest until after you graduate. If you are considering government-backed loans, you will have to complete loan entrance counseling and fill out a promissory note.

 Scholarships

There are several scholarships offered that vary by academic standing, financial need, and community status. Most scholarships are awarded after submitting a summary of why you are requesting the scholarship.

Planning for Classes Well In Advance

Not all classes are available at the same time or fit your schedule. The good news is that there are several night, weekend, and online classes that are designed to fit all types of schedules. There are several guides and personnel that can help you develop a schedule of courses that fits your life. Most classes can be taken at the nearest community college at a third of the price of a 4-year college.

When going back to college or attending for the first time as an alternative student, you have to realize that it is an important lifestyle decision and that it will be stressful for you and your family.

In fact, your ability to succeed depends a lot on how well your family cooperates and is willing to be supportive and patient while you are away and studying. Make sure to thank them and show appreciation for what they are doing for you whenever possible.

Building your class schedule can be completed entirely online using tools available at most colleges, even community colleges. Once you have a student number, you will be able to access these sites. Always remember that taking these classes is a high priority. Failure in one class can affect your whole degree, since graduating depends on your GPA, not to mention the lack of available tuition assistance if your GPA falls.

 Registration

I have tried desperately to find out online when class registration begins and have failed each time. I can tell you that on average class registration starts 4-5 months before the first day of class, and that you should speak with a college representative around that time, because they may be able to give you an exact date and help you register.

 Developing a Course Schedule

 


As I previously mentioned, advisers are usually available at each regional site every semester. Call your regional office to find out when your adviser will be visiting and to schedule an appointment. Talking with an adviser can save you a lot of time and money when planning how to take your classes. It is not easy to know when classes will be available online, and you usually need an academic adviser to admit you into the class you want to take. While pursuing my master's, the student adviser was the main person I talked to until I had to create my final thesis.

It is important not to overdo your first semester if you are a returning student. Maybe take one class for a semester, and if that seems easy, take two or three the next semester. An important step when taking these classes is to talk to students who have taken the class before. They will be able to tell you whether it was hard or easy and what the instructor expects.

 Finding Time for Everything

If you are an alternative student, it is usually because you have a family that you just can't leave to go back to college full time. Going back to school as an alternative student is not just stressful for you but for your family too.

Keep your significant other informed; share as much as possible about what you are learning and why it is important to your shared future. Remind that person that you would not be able to go to college at all if it weren't for their help with the kids. If you have children, make sure to let them in on what you are doing too. I let my family know that I was doing this to better not just my life but theirs as well. Homework and classes on top of working 40+ hours a week and any other activities are really stressful, but you have to make time for your family too. Try to finish your homework during the week, even if you don't get to watch your favorite TV show; it is more important to have your Saturdays a little open to enjoy with friends and family so you don't get burned out.

If you don’t think you will be able to finish an assignment on time you could talk to an instructor about getting an extension. Most college instructors realize that you have a job and are working harder than the regular student and want to help you find ways to achieve your degree. Even if this doesn’t sound like fun I have used some of my vacation days just to finish large assignments before they were do,

your not a teenager anymore you just can’t pull an all-nighter, doing so could jeopardize your job when you fall asleep at work.

Prepare for Post-College

Congratulations on graduating, but you still need to find a job. Start by building a resume, searching for openings, and preparing for that first professional interview.

(Graph: 2013 pay and unemployment by education level)

I added a graph to help motivate you a little more on getting your chosen degree. It shows average income levels and unemployment based on education. The survey was done by the National Bureau of Labor and Statistics and published in 2013.

Education compared to Annual Income

 Creating a Resume

A polished resume is crucial after graduating. Employers usually prefer a short, to-the-point, one-page resume that contains all the pertinent personal information. Cut the least important information to make it fit on one page. If you don't have much work experience in your graduate field, make sure to include your college degree at the beginning of your resume.

There are several places online that will help build your resume and even critique it for you. Check with your local community college also; there is usually a job placement counselor available who can be very useful. Check the MyFSU student services tab online for more information on job placement and resume building after graduation.

 Finding a Career

Most career searches are done online. Some of the best places to look online are:

Make sure to tell all your family and friends that you have graduated and are looking for a new job. Some of the best positions near you will be found through social networking. You might have heard the expression, “It doesn't matter what you know, it's who you know that matters.”

 Conclusion

I was offered several paths to academic success, which led to professional success and a better life. I hope that I have made it a little easier to understand some of the steps to achieving that success.

There is also plenty of contact and extra information readily available online that you can pursue at your leisure.

I do not believe that college is the only way; some very intelligent people, and some not so intelligent, have made it just by following their dreams.

A lot of positions require a higher degree of education…but not all.

 Works Cited

 Cassidy, Daniel J. The Scholarship Book 12th Edition. New York: Prentice Hall Press, July 2006

“Education and Income”. National Bureau of Labor and Statistics. 2002: 1-2

Gottesman, Greg. College Survival. New York: Prentice Hall, 1992

Osborne, AJ. Part-time clerical worker, Student Service Office for Extended Learning, Room 107. 20 April 2007

“University Center for Extended Learning: Ferris State University.” Ferris.edu, 07 July 2007, <http://www.ferris.edu/ucel/index>

Upcraft, Lee; Gardner, John; Barefoot, Betsy. Challenging and Supporting the First Year Student. California: Jossey-Bass, 2005

Ethics in Business Intelligence

Ethics

Ethics in business intelligence (BI) means the ethical principles of conduct that govern an individual in the workplace or a company in general. It is also known as professional ethics and is not to be confused with other forms of philosophical ethics, including religious or popular conviction. For Griffin (1986), professional ethics means that profit is no longer the only important strategy of a business; there is also more concern for, and motivation from, doing what is right.

Companies must acknowledge that they have a common good: to protect their local community, improve employee relations, and promote informational press to the public. Back in 1986, Griffin was directing his argument towards ethics in accounting, but it is also true today of Business Intelligence. Government regulations are not changing fast enough to cover all the changes in technology that bombard users on a day-to-day basis. It is up to corporations to create a code of ethics and to persistently be receptive to the needs of the public being served.

Every day in BI, management professionals may be at risk of unethical practices in decisions regarding consumer, business, and/or other employees' data. Ethics is a touchy subject; there is always going to be controversy over how companies choose to handle business decisions, and there is no definite answer when it comes to ethical choices. While sometimes it may involve illegal practices, other times it is just a decision that needs to be made in a company to promote a better way of life for all.

An example of an ethical decision would be a manager of a BI system who chooses to use cheaper data in his or her data-mining activities to save money. The data chosen involves personal credit score reports, and the cheaper data sets have a 20% chance of being incorrect. The manager did not see the decision as unethical when it was made; it was just a way to keep generating close-to-accurate reports and save money. The impact on 20% of the company's customers may be another matter, as more people are turned down for credit because of inaccurate reports. It is not a crime to have implemented the inaccurate data sets, but it may seem an unethical practice to others. While it is important for managers to be able to make their own decisions, this particular decision should have involved more managers, since it affected the whole business. The manager's choice could bankrupt the company as users leave for more accurate competitors. As the example points out, sometimes there is no clear answer to whether an issue involves an ethical or a legal choice, and each situation can be different. Trying to make decisions based on individual beliefs when dealing with a company can amount to intellectual stalls, and coming to a decision can be expensive and time consuming.
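
The trade-off in that example can be put in rough numbers. Every figure below is hypothetical, invented purely to illustrate the expected-cost reasoning; only the 20% error rate comes from the example above:

```python
# Hypothetical expected-cost comparison: cheap data with a 20% error
# rate vs. accurate data. All dollar figures are invented.
customers = 10_000
error_rate = 0.20               # from the example: chance a report is wrong
savings_per_report = 1.00       # assumed saving per report from cheaper data
cost_per_lost_customer = 50.00  # assumed value lost when a customer leaves
leave_rate = 0.25               # assumed share of wrongly scored customers who leave

savings = customers * savings_per_report
expected_loss = customers * error_rate * leave_rate * cost_per_lost_customer

print(f"Savings:       ${savings:,.2f}")
print(f"Expected loss: ${expected_loss:,.2f}")
```

Under these made-up assumptions the "cheaper" data costs more than it saves, which is exactly the kind of analysis a wider group of managers could have surfaced before the decision was made.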

Today's society has come to the point where there are more solutions to problems than ever before. What once was impossible can now be accomplished through the use of BI and similar technology. It is not going to stop; technology is going to keep advancing, and what seems improbable now may be common in the near future. Because of business globalization, there is also a larger separation between companies and customers, and between companies and competitors, than there was when everything was done locally. This larger separation has resulted in unethical and sometimes illegal business decisions like data theft. Because of all the technology used in big business, and the resulting exposure of unethical practices by some larger corporations like Enron, there is growing pressure on large companies to be free of unethical practices. Additionally, the general trust level of users has eroded to the point where trust really has to be earned. Users are very aware of cases of identity information being lost to theft, as well as other examples in the media. Users have adopted a "show me" attitude: prove that they are safe and that their information is safe, or they will not do business.

IT Personnel in Ethics

It is easy for BI managers to sit behind their desks and manage data on a day-to-day basis thinking that ethical practices do not concern them. That is not the correct attitude to have. Everyone employed in the information technology field has an obligation to take part in company ethical policies and practices. It is not just about creating schemas and data models; as IT managers, they have more of an ethical decision to make than their employers, because the BI manager knows the most about the emerging technology and has the best knowledge of what the company's technology makes possible, given all the work that goes into an information system, information delivery, and the business's ethical dilemmas.

Code of Ethics

Every technology-backed association deals with ethical issues in its own way. The Association for Computing Machinery (ACM) has set out a strong code of ethics, including: “Computing professionals have a responsibility to share technical knowledge with the public by encouraging understanding of computing, including the impacts of computer systems and their limitations. This imperative implies an obligation to counter any false views related to computing” (ACM, 1992, para. 3). While most of the code covers general ethical issues, it also covers leadership and other professional responsibilities in information technology, and it is worth looking up.

PAPA Framework

PAPA is an acronym for privacy, accuracy, property, and accessibility, a framework proposed by Richard Mason as the four ethical issues of the information age. He proposed it 25 years ago, in 1986, and it is still acknowledged as covering the four subjects of ethics in information technology, including ethics in BI as more and more data is extracted, transformed, and loaded into data warehouse silos. A lot of our private information is handled by BI in Customer Relationship Management (CRM) systems such as Amazon's customer web portal. While Amazon is making its web services better and more geared toward individual use, it also demands some of your private information in return so the CRM can accurately predict what you may need and want. A privacy policy should include notice of what data is being collected, how it is being used, an option to participate or not, security measures to protect the data from misuse, the ability to access your personal information to review and correct it, and steps to enforce the stated policies. Opposing privacy is the need for security: inadequate security measures can be viewed as carelessness, and while participating in data collection is nominally optional, choosing not to participate usually means the company will not provide its services to you.

Accuracy in Data Mining (DM) and BI systems is very costly, and the percentage of accurate data is a business decision. Some companies can ethically choose less accurate data and still maintain a competitive edge and supply users with their services, while other systems, like a Hospital Information System (HIS), cannot afford reduced accuracy when a person's life is hanging on the line. As for who is responsible for the accuracy of the data, executives may set business processes as guidelines, but the main responsibility still falls to the BI manager, who must understand the BI database and handle the integration of new data. Executives do not care how the analytics work, only that they are presented with accurate reports and dashboards. The reliability and integrity of a BI system ultimately rest on the personnel who can navigate the sea of technology involved, not the end users. When an ethical situation arises within the company, who will be held liable: the executive who did not know the technology, or the BI manager in charge of data accuracy?

Accessibility of data was once the privilege of a significantly smaller group of users than it is now. With the explosion of BI and web interfaces, anyone with a smartphone, computer, laptop, or PDA can gain access to DM information, and the technology gap, also known as the digital divide, is growing smaller. Information is power, and users have a right to a level playing field. We have a moral obligation to provide the skills to understand, manage, and access information throughout the world, so that no larger technology gap is created based on poverty, sex, age, or race when it comes to access to data that provides basic survival information. While sharing data freely is a goal that helps individuals, there is a limit to what can be shared among business partners, customers, and competitors; even so, they should have the right to reach the same results using technology.

Ethical Issues in BI

While many ethical issues are obscure and hard to notice at the surface, there is one concern raised by most users. According to Hackathorn (2005), the best-known ethical issue in BI is the involuntary release of personal information, which has led to identity theft. The theft of personal information such as Social Security numbers, birthdates, and credit card numbers has allowed technically skilled criminals to walk away with billions of dollars of innocent victims’ money nationally.

Organizations need to be accountable for financial data. The U.S. has required financial accountability through regulations like the Sarbanes-Oxley Act (SOX) of 2002. Yet according to Wallace (2011), the main focus of SOX is measuring the internal effectiveness of business controls, and it does not explicitly address IT. Because of that gap in SOX, ISO 17799, the international standard Code of Practice for Information Security Management, is being adopted by companies as a framework for maintaining information security and protecting information systems from unauthorized access, use, modification, and destruction.

The pressing issue of homeland security, and the USA PATRIOT Act passed after the attack on the World Trade Center in New York, left the government with broad power to analyze anyone in the United States as a threat by collecting almost any type of data it wishes, including financial activity and how it may be related to terrorism.

Technology is also being implemented at airports to fight terrorism. According to Worthen (2006), the Transportation Security Administration (TSA) continuously tests different data mining techniques to find the most effective way of weeding out terrorists so that they never gain access to the airlines again. An almost never-ending budget and the lack of a well-defined scope allow the TSA to try newer technologies than other sectors of business can. After 9/11, the Computer Assisted Passenger Pre-screening System (CAPPS), which used consumers’ names, credit card information, and addresses to screen for criminals, was changed to CAPPS II, which combined its predecessor's technology with information purchased from data stores run by ChoicePoint and LexisNexis. CAPPS was eventually replaced with a newer system called Secure Flight, which shares the same process of combining passenger data with information purchased from commercial data providers. Over $125 million was spent in the name of homeland security in just the first five years after 9/11.

Framework for Solving Ethical Dilemmas

The first step in solving any ethical problem is being aware that there is an ethical situation. Be open and honest about the situation while avoiding discussions that could magnify the problem, and try to make ethics an acceptable topic in the workplace. Next, thoroughly research the ethical problem while staying focused on the problem at hand rather than the greater issues; if a person feels the need to solve greater ethical issues that do not impact the company, that should be done on their own time. Once the research is done and you understand the root of the problem, decide what should be done to fix it. After making the decision, make sure it is properly documented so that you and future employees can learn from it. Solving an ethical problem follows the same process as any effective decision and can be broken into six simple steps: identify the decision, get the facts, develop alternatives, rate each alternative, make the decision, and implement it. Be clear about your actions, and if you cannot reach a sound solution on your own, consider hiring someone who can.
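The rate-and-decide portion of those six steps can be sketched in a few lines. This is only an illustration: the alternatives, criteria, and scores below are invented, not part of any real framework.

```python
# Hypothetical sketch of the "develop alternatives, rate each alternative,
# make the decision" steps. Alternatives and their 1-5 scores are invented
# for illustration (e.g. one score each for legality, fairness, cost).

def best_alternative(ratings):
    """Return the alternative whose scores sum highest."""
    return max(ratings, key=lambda name: sum(ratings[name]))

# Invented example: three ways to handle a discovered data-accuracy problem.
ratings = {
    "ignore it":           [1, 1, 5],
    "quietly patch data":  [2, 2, 4],
    "report and document": [5, 5, 2],
}
print(best_alternative(ratings))  # -> report and document
```

The point is not the arithmetic but the discipline: writing the alternatives and criteria down makes the decision, and its documentation, explicit.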

Benefits of Ethics in IT

Employers may find that “Although data are mixed, numerous studies in the field of computer ethics support the hypothesis that a written and clearly transmitted code of ethics is a strong influence on employee behavior when an ethical decision is involved” (Computer Ethics, n.d.). Companies that change their thinking to become more ethical will also get ahead of government regulations, implementing ethical solutions at their own affordable pace rather than scrambling to match new rules, and will save themselves the cost of future fines and fees for data misuse in their BI systems. A company well known for protecting its BI systems, not only from security breaches but also from unethical practices, will most likely hold a competitive advantage over its rivals and can better align its BI processes with the broader strategy. The main reasons, though, are to earn trust in your products and services and to get a good night's sleep knowing you have not caused financial or emotional harm to others.

 

Conclusion

Governments cannot change laws fast enough to address the ethical problems arising from new technology, so it is in companies' best interest to be proactive about ethical situations within their IT departments. IT personnel have a role to play in keeping BI systems protected and ethical: they know the systems better than anyone else in the organization and have a responsibility to help keep the data safe. PAPA is a good guideline to follow for ethics in data, and while you may not want to discuss ethics, a company can benefit from being ethical. Choices must be made; doing nothing is always a choice, but it is a poor one when the stake is the company's reputation.


References

ACM Council. (October, 1992). Code of Ethics. In Association for Computing Machinery. Retrieved April, 2011, from http://www.acm.org/about/code-of-ethics.

Computer Ethics – Computer Ethics In The Workplace – Ethical, Companies, Company, Organizations, Norms, and Employees, (n.d.). retrieved April 25, 2011, from http://ecommerce.hostip.info/pages/243/Computer-Ethics-COMPUTER-ETHICS-IN-WORKPLACE.html#ixzz1KajkwiJr

Griffin, Charles H. (1962). The Practical Philosophy of Professional Ethics. Journal of Accountancy (pre-1986), 113(000005), 92. Retrieved April 24, 2011, from ABI/INFORM Global. (Document ID: 83270709).

Hackathorn, Richard. (August 2003). Ethics of Business Intelligence: A Practical Treatment Retrieved April, 2011, from www.bolder.com/pubs/TDWI200308-BI%20Ethics%20v5.pdf

Hackathorn, Richard. (September 2005). Ethics in business intelligence. In Bolder. Retrieved April, 2011, from www.bolder.com/pubs/TD-BIEthics.pdf.

Peslak, Alan R.. (2006). PAPA REVISITED: A CURRENT EMPIRICAL STUDY OF THE MASON FRAMEWORK. The Journal of Computer Information Systems, 46(3), 117-123.  Retrieved April 25, 2011, from ABI/INFORM Global. (Document ID: 1038730691).

Wallace, L., Lin, H., & Cefaratti, M.. (2011). Information Security and Sarbanes-Oxley Compliance: An Exploratory Study. Journal of Information Systems, 25(1), 185-211.  Retrieved April 25, 2011, from ABI/INFORM Global. (Document ID: 2298740021).

Develope a Web Based CMS Using PHP


Abstract
The Content Management System (CMS) is a web-based application built on a Linux server, Apache web server, MySQL database, and the PHP programming language (LAMP). The objective of managing users and information in any given network environment is limited only by the creativity of the information technology professional, not by the technology. The main objective of this thesis is to carry out the early development steps of a LAMP-based CMS: creating the building blocks and considering the basic methods for building the core platform of a CMS for further development. All information gathered and experience gained will assist in developing and offering my own e-commerce business solutions in the future, and in gaining additional business and practical knowledge of open source software and e-commerce.

Enhance VIM

While vi has been used for years, on current Linux and OS X systems a lot of people have moved to Vim.

There are a lot of tricks in Vim that can make it easier to use. Some are not active by default, so I want to introduce a few of them here, starting with one I have found especially useful:

set background=dark

Let's start by making sure you have the enhanced Vim package installed on your system. Use one of these commands from a terminal prompt to install it.

Mint/Ubuntu (Debian): sudo apt-get install vim (Debian-based distributions ship the enhanced build in the vim package)

Fedora (Red Hat, CentOS): sudo yum -y install vim-enhanced

openSUSE (SUSE): sudo zypper install vim-enhanced

If one of these does not work or your operating system is different, there are several other options for installing packages; try your distribution's GUI package manager as well.

The best way to modify settings in Vim is to have a .vimrc file (Vim run commands file) in your home directory.

cp /etc/vimrc ~/.vimrc

Open the file:

vim ~/.vimrc

From here you can adjust all sorts of text editor behavior in Vim.

Lines starting with " are comments; just remove the " to enable a setting.

The one that made a difference for me was:

set background=dark

This is a GREAT setting if you are working in Vim and have a dark background.

It will change what you see: with the line still commented out ("set background=dark), the default colors make comments hard to read on a dark terminal; uncomment it (set background=dark) and the colors are chosen for a dark background instead. [Screenshots: Vim before and after setting background=dark.] This makes commenting very easy to see. There are several ways to change the config file, and in the file we copied (cp) over you can read some of the other options.

Besides being able to read things better now, I also like it when the cursor is in the same place as the last time I was in the file. Another one at the top of my list is the ability to scroll the cursor with the mouse.
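Those two behaviors can be turned on from the same .vimrc. A sketch of the lines I mean, assuming a reasonably modern Vim (your distribution's default vimrc may already include a variant of the first one):

```vim
" Jump back to the last cursor position when reopening a file
autocmd BufReadPost * if line("'\"") > 0 && line("'\"") <= line("$") | exe "normal! g'\"" | endif

" Enable mouse support, including scrolling, in all modes
set mouse=a
```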

Play around with the file; you can always delete it and start from the beginning again if you mess it up too badly.

404 Webpage Fix for WordPress and Host Gator

If you are getting 404 errors, use this fix, but make sure to deactivate any WP plugins that may be causing them! Not to go into specific details, because it is a pain in the neck, but certain plugins that rewrite URLs for different browsing experiences can really mess things up within WP. Selectively deactivate plugins and wait to see if the problem goes away.

I cannot remember the name of the plugin that was doing it, or I would tell you.

This is a quick fix I found that worked, from yongee.hubpages.com; I just want to spread the information around and confirm that it does work.

Modify your .htaccess file, or rename it .htaccessOLD and start from scratch with a new .htaccess file.

    # BEGIN WordPress

    ErrorDocument 404 /index.php?error=404
    RewriteEngine On
    RewriteBase /
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule . /index.php [L]

    # END WordPress

   ## This kept giving me 404 ERRORS with WORDPRESS and HOSTGATOR GRRRRRRR ###
   ## RewriteEngine On
   ## RewriteCond %{HTTP:X-WAP-PROFILE} !^$ [OR]
   ## RewriteCond %{HTTP_ACCEPT} application/vnd.wap.xhtml\+xml [NC,OR]
   ## RewriteCond %{HTTP_ACCEPT} text/vnd.wap.wml [NC]

This has only happened for me with HostGator. Nothing against HG; they have the best support I have worked with.

Digital Forensics

Reviewing the concept of anti-forensics, which can be described as: “…more than technology. It is an approach to criminal hacking that can be summed up like this: Make it hard for them to find you and impossible for them to prove they found you” (Berinato, 2007). The ultimate fear is that the rise of anti-forensic tools and techniques could make any data collected suspect, and that it jeopardizes the validity of any forensic investigation (or at least makes investigations so cost-prohibitive that they are seldom feasible). Throughout this paper we will look at what these tools and techniques are, from new developments in the field intended to conceal illegal activity to traditional anti-forensic methods for wiping data when old equipment is sold or no longer needed. We will likewise examine the potential impact on the future of forensic investigations, as these techniques could make the probability of a conviction extremely low.

Case studies

After painstakingly searching several sites in an attempt to find documentation of successful anti-forensics stories and the tools that were used, I came up pretty empty. While a few stories describe how people have tried to fool digital forensic experts, the fact is that no one is going to report being successful at fooling investigators, because they want to be able to fool them again in the future. Even digital forensic investigators are unwilling to share case stories about what they found and the conclusions they reached, so that they can remain experts in their field. The following stories are what I was able to find. If you ever come across any interesting stories, like explosives rigged into computers or magnetized doorways, I would be interested to hear about them. Lastly, I have included information on how anti-forensics can be useful for personal use, to keep your own information safe.

Given the number of digital forensic cases that have been posted since this research paper was started, the amount of information available within the next year should grow exponentially from what is available at present.

Additionally, I have concluded from reading several discussions and online expert opinions that while EnCase is the digital forensic tool of choice for a broad overview of the file system, it is only one of the primary tools in an arsenal that usually has a few others dropped into the mix; only through peer suggestions and trial and error will you be able to decide which tools are best for you.

Just as some people use torrents to collect illegal free music, movies, and books, pedophiles use the same technology to spread child pornography to one another. Investigators in Trenton, N.J. tracked the digital fingerprints of pornographic pictures as they left one person's computer, followed them to the next IP address, and traced the pictures to a total of 27 adults. One of the adults was arrested ahead of the others when officers found out he lived above a daycare facility.

After 100 state troopers and three months of hard work, the time came to collect the computers from the suspects and extract the digital evidence needed to convict the 27 individuals of the federal offense of creating or possessing child pornography. The traceable factor was the electronic watermark imprinted on each image, which made each image traceable on individuals' computers, along with the routes the images took on the internet. Artifacts left on the computers proved that the images were downloaded and viewed even if the images themselves were deleted; like a fingerprint on a murder weapon, this should make it easy to convict each person.
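One common way such "digital fingerprints" are matched across machines is a cryptographic hash: a bit-identical copy of a file produces the same digest on any computer it travels to. This is a minimal sketch of that idea, not necessarily the exact watermark technique used in the case; the "image" bytes are invented.

```python
# Hash-based file fingerprinting: two bit-identical copies of a file hash
# to the same digest on any machine, so the digest can be matched across
# computers and network transfers. The bytes stand in for an image file.
import hashlib

original = b"stand-in bytes for a shared image"
copy_on_other_pc = b"stand-in bytes for a shared image"

fp1 = hashlib.sha256(original).hexdigest()
fp2 = hashlib.sha256(copy_on_other_pc).hexdigest()
print(fp1 == fp2)  # -> True
```

Note that any single-bit change (re-encoding, cropping) produces a completely different digest, which is why exact-match hash lists work best for files shared verbatim.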

The most notable anti-forensic measure used by one of the culprits was a set of heavy-duty magnets installed in his shoes to erase hard drives of incriminating evidence. Yet with all the network detective work, the magnets in the shoes probably just helped prove his guilt.

Because the images were shared on a peer-to-peer network, every person arrested will be charged not only with possession of child pornography but also with distribution, because most torrent downloads automatically start uploading to other users who request the same data (Fletcher, 2012).

Thirty-year-old Higinio O. Ochoa, a member of the hacker group CabinCr3w, an offshoot of Anonymous, was arrested after he posted an image of his girlfriend from an iPhone to Twitter. What he neglected to take into account was the GPS-tagged EXIF metadata imprinted on the image. When the FBI viewed the metadata, it effortlessly pointed to his girlfriend's house in the outer Melbourne area. Because of its nature, I cannot post the actual image in this research paper, but I can tell you his girlfriend was displaying a message that read, “PwNd by W0rmer & cabinCr3w <3 u B(commented out)’s!”. The EXIF data has since been wiped from the photos posted online.
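EXIF GPS tags normally store latitude and longitude as (degrees, minutes, seconds) rationals plus a hemisphere reference. This sketch shows the usual conversion to the decimal degrees a map service accepts; the coordinates are made up for illustration, not the ones from the case.

```python
# EXIF GPS tags hold latitude/longitude as degree-minute-second triples
# with an N/S or E/W reference. Converting to signed decimal degrees is
# simple arithmetic. Coordinates below are invented.

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert an EXIF-style DMS triple to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if ref in ("S", "W") else value

lat = dms_to_decimal(37, 46, 30.0, "N")
lon = dms_to_decimal(144, 57, 0.0, "E")
print(round(lat, 4), round(lon, 4))  # -> 37.775 144.95
```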

I was not able to find any current digital forensic tools that look for encoded messages rather than just encrypted ones. One helpful post from a digital forensics expert suggests that by using Unicode escape sequences you could possibly circumvent most digital forensic tools, unless the examiner is smart enough to check for them. For example, \u0048 \u0045 \u004c \u004c \u004f spells out HELLO.
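A minimal sketch of the decoding step: a keyword scan that only looks at the raw text would see backslashes and digits, while decoding the escape sequences reveals the hidden word.

```python
# Decoding Unicode escape sequences like the example above. A raw keyword
# search over the literal text would miss "HELLO"; decoding reveals it.

hidden = r"\u0048\u0045\u004c\u004c\u004f"   # raw text, escapes not interpreted
decoded = hidden.encode("ascii").decode("unicode_escape")
print(decoded)  # -> HELLO
```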

Fortunately, people are trying to close this gap in digital forensic tools. Pavel Gladyshev of the UCD School of Computer Science and Informatics is working on a project to develop tools that will not only search raw binary data for keywords but also check character encodings, including ASCII, UTF-8, UTF-16, and UTF-32, for embedded escape sequences.

Anti-forensics for Your Protection

Some people might jump to the conclusion that using anti-forensics to protect your information implies you are trying to hide something illegal. That is not always the case; sometimes it is useful to use anti-forensic tools in ordinary daily activities to protect against malware that targets devices like smartphones (Storm, 2011). Take for example the mobile forensic solutions offered by the company Cellebrite, which can extract deleted data from smartphones and tablets. While most of the information gleaned comes from a hardwired connection, it is possible for devices to connect wirelessly through an infrared or Bluetooth signal. The ability to access data remotely from a smart device makes forensic devices dangerous for the general populace, because they may be used for criminal activity or spying (Bloomberg, 2012).

Companies like WhisperSystems (www.whispersys.com) make it a little harder for governments and criminals alike to take data from your smart device, by providing full disk encryption, network security tools, encrypted backup to the cloud, and selective permissions. Such anti-forensic software will not only encrypt the data on your phone; it can also encrypt your text messages and voice calls if the other person is using the same software (and if they are not, it still encrypts the data on your phone). This protection is necessary not just against a direct attack but also against malware that might disguise itself as an application you really want on your device.

In the near future I will be testing mobile digital forensic tools at Ferris State University, including at least one of the free anti-forensic tools, and I plan to come back and add more on forensics and security.

Reference

Berinato, S. (2007, June 8). The rise of anti-forensics. Retrieved from http://www.csoonline.com/article/221208/the-rise-of-anti-forensics

Bloomberg Government, (March, 2012) IPhones to BlackBerrys Cracked by Cops Using Digital Forensics. Cellebrite mobile data secured. Retrieved 4/18/2012. From http://www.cellebrite.com/news-and-events/mobile-data-news/335-iphones-to-blackberrys-cracked-by-cops-using-digital-forensics.html

Fletcher, J. (April, 2012). N.J. investigators track digital ‘fingerprints’ on shared images to nab child pornographers. The republic of Columbus Indiana. Retrieved 4/18/201, from http://www.therepublic.com/view/story/CPT-CHILDPORN_7786030/CPT-CHILDPORN_7786030/

 

Open Source Software in Digital Forensics

The purpose of this research paper is to gather information on open source digital forensic tools that are accessible for free, usually online, and to review the types of digital forensic tools available and what they do. The basic definitions of open source and digital forensics are given, along with how Open Source Software (OSS) digital forensic tools can help accomplish data retrieval. The pros and cons of why OSS should be considered a viable digital forensic tool-set are also covered.

Digital Forensics and Incident Response and Tools

Digital Forensics and Incident Response (DFIR) is the method of investigating and analyzing data for the purpose of presenting an ordered report, showing a chain of evidence of what happened on a computer and who was responsible, to a court of law (SearchSecurity, September 2004). DFIR is becoming more common as more and more people use computers in their daily lives, from smartphones to game consoles to laptops. DFIR can help convict anyone of any crime that involves a computer, whether it is prostitution, child pornography, or a white-collar crime like embezzlement.

DFIR tools are the free and proprietary applications used by DFIR experts to produce the results handed over to the legal system. They allow investigators to examine the contents of a hard drive without making changes to the data held within. The information retrieved can come from deleted, encrypted, or damaged files (SearchSecurity, September 2004).

Open Source Software



Open Source Software (OSS) is a set of practices used to collaborate on software source code that has been made freely available through copyright licensing. It is also commonly known as FOSS (Free and Open Source Software); although most OSS is free, not all of it is, but this research paper covers mostly the free variety. Individuals from diverse cultures, corporations, and languages can work together across boundaries to create complex, non-proprietary software. Software is open source when it is free to redistribute and the source code is distributed along with the compiled form. Open source licensing was created to make the source code of a program readily available to anyone who requests it. Making the source available to everyone helps in developing stable software, because the whole community can make changes and redistribute its own versions. An open source license protects the original author, does not discriminate in any way regarding how the software can be used, cannot be specific to a product, cannot restrict other software, and has to be technology-neutral (Open Source, n.d.). There are several variants of open source licenses that can be reviewed at opensource.org (http://www.opensource.org/licenses/category).

Some of the more widely known open source licenses cover GNU software (much of the toolchain behind Linux desktops), Mozilla (Firefox, Thunderbird), MIT, BSD (Unix-like systems), and Eclipse (the Eclipse IDE). Because there is no dependency on software vendors, open source software can transform and morph into almost anything users and developers need it to do. It gives users the freedom to use it when they want, how they want, and on their own terms.

Why OSS DFIR Tools

Open source digital forensic tools address specific gaps in the forensic capabilities of proprietary DFIR tools. They range from tools for analyzing memory dumps, disks, network traces, and cell phones to memory images from game consoles. The fact that many of these tools focus on one specific area of digital forensics makes them invaluable to investigators who find the all-in-one packaging of some proprietary DFIR suites cumbersome and lacking in places.
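To make the disk-analysis category concrete, one technique such tools implement is file carving: scanning raw bytes for known file signatures regardless of what the file system says. This is a toy sketch with a fabricated "disk image", not the implementation of any particular tool; real carvers handle fragmentation and validation that this ignores.

```python
# Toy file carver: find an embedded JPEG in raw bytes by its start marker
# (FF D8 FF) and end marker (FF D9), ignoring the file system entirely.
# The "disk" below is fabricated for illustration.

def carve_jpeg(raw):
    """Return the first JPEG-like byte range in raw, or None."""
    start = raw.find(b"\xff\xd8\xff")
    if start == -1:
        return None
    end = raw.find(b"\xff\xd9", start)
    if end == -1:
        return None
    return raw[start:end + 2]

disk = b"leading junk" + b"\xff\xd8\xff\xe0" + b"fake jpeg body" + b"\xff\xd9" + b"trailing junk"
jpeg = carve_jpeg(disk)
print(jpeg is not None and jpeg.startswith(b"\xff\xd8"))  # -> True
```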

Financially, companies and governments are always looking for ways to cut budget costs, and the same goes for DFIR investigators trying to find work. An investigator using open source DFIR tools can offer customers a lower price than one who has to pass on the expensive cost of proprietary tools, and even law enforcement has an easier time justifying the expense on budget reports alongside departments like traffic enforcement and drug trafficking. Because of the high cost of proprietary applications, the follow-up cost of updates may be neglected, leaving the software antiquated and not viable for future investigations.

Legally, procedures for finding digital evidence need to be defensible in court as being testable, published under peer review, having a known error rate, and being generally accepted in the relevant scientific community. Because proprietary tools are closed source, and the companies offering them do not want to acknowledge mistakes in their software, that alone makes a case for choosing OSS digital forensic tools during an investigation: OSS allows the source code to be evaluated and tested and error rates to be traced, and OSS tools are widely accepted in the DFIR community (Carrier, 2002). As Brian Carrier (2002) reported, “The digital forensic market should not be approached in the same way that other software markets are. The goal of a digital forensic tool should not be market domination by keeping procedural techniques secret.” While Carrier may be a little biased, since he developed most of the code in Sleuth Kit, Autopsy, and mac-robber, his experience in digital forensics underlines how important it is to keep OSS DFIR tools in mind.

On the other hand, it is important to note that there is usually a larger learning curve with OSS DFIR tools, since some run from command prompts and on *nix (Linux, Unix, BSD) operating systems. Also, since each usually focuses on one component of DFIR, several different applications need to be tied together to build a report. Because some of the tools take a lot of time to collect and arrange data for a report, they are sometimes better used in a lab than in the field.

Conclusion

While open source digital forensic tools abound, and you can take advantage of all of them while avoiding the fees of commercial products, there are also several good commercial tools available. Because digital forensics is such a vast field of study, it is important not to rely on just one set of tools, and to research and test other methods to discover and fight anti-forensics.

Good luck with any future digital forensic test cases you attempt; please make sure they are conducted ethically and legally.

More information on OSS DFIR tools can be found at sites such as IEEE, the open source references at the National Institute of Standards and Technology (NIST), the National Software Reference Library (NSRL) from NIST, and government and college studies on OSS alternatives in DFIR. Additionally, there is a growing amount of useful DFIR information on personal websites and from OSS developers. I agree with Schneier (2010) that we "would encourage everybody to download and learn the tools not just because they can do forensics but because most of them can also be used for other things such as finding things in memory and hard drives that should not be there which many AV tools cannot do and to help put systems back together again."


Future Note: I plan to compare and contrast some of the more common tools as I continue to study them.

References

Carrier, B. (2012). The Sleuth Kit. Retrieved 4/15/2012, from http://www.sleuthkit.org/sleuthkit/index.php

Cmihai. (October 2007). UNIX System Administration: Solaris, AIX, HP-UX, Tru64, BSD. Retrieved 4/27/2012, from http://blog.boreas.ro/2007/10/digital-forensic-tools-imaging.html

DFF. (n.d.). Open Source Digital Investigation Framework. Retrieved 4/17/2012, from http://www.digital-forensic.org/

Forristal, J., & Shipley, G. (January 8, 2001). Vulnerability Assessment Scanners. Network Computing. Retrieved from http://www.nwc.com

LinuxLinks. (n.d.). 6 of the Best Free Linux Digital Forensics Tools. Retrieved 4/5/201, from http://www.linuxlinks.com/article/20110115103656314/DigitalForensics.html

Nikkel, B. (June 2012). Practical Computer Forensics using Open Source tools. Retrieved 4/23/2012, from www.digitalforensics.ch/nikkel08.pdf

Open Source. (n.d.). The Open Source Definition. Open Source Initiative. Retrieved from http://www.opensource.org/docs/osd

Open Source Digital Forensics. (n.d.). Tools. Retrieved 4/5/2012.

Schneier, B. (December 2010). Open Source Digital Forensics. Retrieved 4/6/2012.

SearchSecurity. (September 2004). Computer forensics (cyberforensics). Retrieved 5/4/2012, from http://searchsecurity.techtarget.com/definition/computer-forensics
