1. Risk Assessment
    1. Project risks
    2. Product risks
    3. Business risks
    4. Technology risks:
      1. Version control failures
      2. Technology failure: software doesn’t work, network failure, etc.
      3. Incompatible components at integration time
    5. People risks:
      1. Non-appearance of team members;
      2. Lack of skills within a group;
      3. Overestimating people’s abilities.
      4. People dropping out of the course.
      5. Team changes because of course changes.
      6. Conflict of interests during design.
    6. Organisational risks:
      1. Influence of deadlines and work for other second year courses;
      2. Non-communication; confusion; not recording decisions; etc.
      3. Inappropriate distribution of work
      4. Not assigning appropriate tasks to each member of the group
      5. Failures in individual time management
      6. Poor project time scheduling
      7. Conflicts between different second-year course deadlines and workloads.
    7. Tools risks:
      1. Tools that do not support the project: Software incorrectly configured.
    8. Requirements risks:
      1. Changing requirements—perhaps the most dangerous
      2. Bad design decisions meaning a re-design and thus delay
      3. Non-compliance with your own glossary.
    9. Estimation risks:
      1. Over ambitious designs not fulfilled
      2. Running out of storage.
    10. Avoidance strategies: reduce the probability that the risk arises.
    11. Minimisation strategies: reduce the impact of the risk if it does arise.
    12. Contingency plans: what to do if the risk becomes a real problem.
  2. Requirements Gathering
    1. Why?
      1. Finding out what the new system should do
      2. Learning how people do their work
      3. Making sure your app supports their activities
      4. Discovering the functions needed
      5. Evaluating systems – acceptance and usability (feedback)
    2. When?
      1. All the time, be "Agile" by keeping users close
      2. Especially at the beginning, but avoid "waterfalling"
    3. Where?
      1. Questioning
      2. Workplace
    4. Why so hard?
      1. Initial requirements are mostly wrong
      2. Requirements increase the more code you write
      3. Sargeant's rule
    5. Refining requirements
      1. Informal notes become:
        1. Glossary
        2. Requirements Document which becomes
          1. Use Cases
        3. Business Rules
    6. Kinds of requirements
      1. Functional Requirements
      2. Non-Functional Requirements
      3. Domain Requirements
      4. User Requirements
      5. System Requirements
    7. Presenting requirements
      1. Prioritised using MoSCoW (Must, Should, Could, Won’t have)
      2. Functional, non-functional and domain requirements, with glossary terms highlighted
    8. Causes of requirements change
      1. Experience with using the software
      2. Change in business processes/management direction
      3. Marketing requirements
      4. Technology change
      5. Misunderstanding and general user inconsistency.
    9. Interview guidelines
      1. Try and ask questions in an ordered fashion.
      2. Your questions should move from the general to the specific.
      3. Use feedback from one question to ask another; be prepared to ask follow up questions
      4. Be aware of who you are asking and what they know.
      5. Make sure the question is relevant to the person you’re asking.
      6. Ask questions that the person is qualified to answer.
      7. Don’t assume an excessive level of technical understanding.
      8. Phrase your questions clearly and concisely.
      9. Don’t nag users.
      10. Think of asking the same question in different ways at different times in an interview.
      11. Make the intentions of the question clear.
      12. Only ask one question at a time; multiple questions offer an opportunity not to answer.
      13. Avoid simple yes/no answers; you’re trying to get the user to talk. The yes/no can come from post-interview analysis.
      14. Try to avoid vague “what if” questions; don’t be too open-ended.
      15. Try and make questions unambiguous.
      16. Be clear and precise; use examples to clarify questions.
      17. You need to understand what the user is doing in their current practices.
      18. Ask questions about how the existing practices work/don’t work.
      19. Use questions to clarify terminology
      20. Don’t let the user make unreasonable demands of the system
      21. Don’t offer the impossible
      22. Don’t ask what the user wants when he or she couldn’t possibly know
      23. Don’t be rude to the client: Don’t accuse them of lying or getting it wrong; phrase your question carefully
      24. Don’t over-use your client’s time
  3. UML
    1. Use Cases
      1. Description
        1. Primary and alternative flows
        2. Entry conditions (Preconditions)
        3. Exit Conditions (Post-conditions)
        4. Actors involved
      2. Actors
        1. An actor is an entity external to the system that directly takes part in use cases. An internal DB is not an actor.
        2. Make actors’ names reflect their role: “user” is too generic; “Fred” is too specific.
        3. Actor names should be specific and unambiguous: “family tree database” is more informative than “database”.
        4. Actors can use the system and can be used by the system
      3. Cases
        1. Use cases capture goals or actions of actors.
        2. Make sure use case names contain verbs (create, add, modify, delete, search, report, inspect).
        3. Looking for symmetry can help: “create” should be matched by “delete”.
        4. Remember fundamental things: “a user will read a document, will look at a picture, etc.”
        5. Avoid mindless detail: “genealogist presses mouse button”.
        6. Each use case should be self-contained.
        7. Be precise and avoid vagueness: who is doing x? what y is being done? how is z being done?
        8. Avoid the passive voice in descriptions; it leaves open the question of who is doing what.
        9. Don’t repeat the use case name in the entry condition: “Print document” has the entry condition “document exists”, not “ready to print”.
        10. Don’t invent use cases that are not in the requirements; “add person” subsumes “add father”.
        11. Avoid use cases that look like algorithm steps, for example “save tree”, “enter tree”.
        12. A use case diagram should look like a star, not an activity diagram.
      4. Whole thing
        1. Name the system box and, again, make it informative.
        2. The system should be a black box.
        3. Use case diagrams show which use cases exist, but not necessarily how they are related: no choice points or parallelism.
        4. Activity diagrams show a flow-chart-style view of the process, including choices and parallelism.
      5. Common errors
        1. Bad names.
        2. Fine-grained tasks.
        3. Chaining use cases together.
        4. Missing use cases.
        5. Wrong use cases.
        6. Mis-identified actors.
        7. Missing actors.
        8. Mis-named actors.
        9. Unclear arrow semantics: control, data flow, or initiating actor?
        10. Wrong level of abstraction: too fine-grained at the start, too broad at the end.
        11. The main flow is the normal course; an alternative flow catches deviations but should still meet the goal (e.g. pay by cheque rather than credit card).
        12. An exceptional flow does not meet the goal (e.g. the card doesn’t work).
        13. Including too much UI detail.
        14. Building data structures.
    2. Used to model the DOMAIN
    3. From DOMAIN to DESIGN
      1. Domain classes often inspire design classes
      2. Some domain classes represent entities outside the software (e.g. actors).
      3. Extra design classes (Pure Fabrications) are often added, e.g. to represent collections; see the sketch after this list.
      4. The key skill required is assigning responsibilities to (software) classes.
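      A minimal Java sketch of the distinction, reusing the family-tree domain from the use case examples above. Person is a domain class; PersonRegister is a hypothetical Pure Fabrication invented at design time purely to own the collection (both class bodies are illustrative, not from the notes).

```java
import java.util.ArrayList;
import java.util.List;

// Domain class: corresponds to a real-world concept in the family-tree domain.
class Person {
    private final String name;
    Person(String name) { this.name = name; }
    String getName() { return name; }
}

// Pure Fabrication: no real-world counterpart, added only to take on the
// responsibility of storing and finding Person objects.
class PersonRegister {
    private final List<Person> people = new ArrayList<>();

    void add(Person p) { people.add(p); }

    Person findByName(String name) {
        for (Person p : people) {
            if (p.getName().equals(name)) return p;
        }
        return null;   // not found
    }
}
```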
    4. Used to model the SPECIFICATION & DESIGN
      1. Design Modelling
        1. more detailed than domain class diagrams
        2. include operations as well as attributes
        3. include software-oriented things such as types and visibilities.
        4. deal with a small number of classes at once, since each is described in more detail.
      2. Extra UML features (illustrated in the Java sketch after this list)
        1. Types, shown after the name, e.g. name: String
        2. Visibilities: + (public), - (private)
        3. Parameters on operations, e.g. addOption(option: MCQOption)
        4. Static attributes and operations (underlined in UML), e.g. inputFromFile(filename: String): Question
        5. Constructors: Student(name, ID)
        6. Types and visibilities are still optional
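      A rough Java rendering of the notation above, reusing the Question, MCQOption and Student examples from the notes; the field names and method bodies are assumptions for illustration only.

```java
import java.util.ArrayList;
import java.util.List;

class MCQOption {
    private String text;                        // UML: - text : String
    MCQOption(String text) { this.text = text; }
}

class Question {
    private String name;                        // UML: - name : String ('-' = private)
    private final List<MCQOption> options = new ArrayList<>();

    public void addOption(MCQOption option) {   // UML: + addOption(option : MCQOption)
        options.add(option);
    }

    // UML shows static operations underlined, e.g. inputFromFile(filename : String) : Question
    public static Question inputFromFile(String filename) {
        return new Question();                  // placeholder body
    }
}

class Student {
    private String name;
    private String id;

    Student(String name, String id) {           // UML: constructor Student(name, ID)
        this.name = name;
        this.id = id;
    }
}
```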
    5. Used to document ACTUAL CODE
  4. Software processes
    1. Waterfall Development
      1. Development
        1. Requirements gathering
        2. Systems analysis
        3. Design
        4. Coding
        5. Testing
        6. Maintenance
      2. Motivation
        1. The later a problem is found, the more expensive it is to fix
        2. Well defined milestones/deliverables make budgeting/planning easier
        3. Well documented, problems clearly identified
        4. Promotes specialisation because of the separate phases
      3. Common problems
        1. Deliverables often late or fudged (the “blank tape” trick)
        2. Analysis paralysis often occurs
        3. Specialisation can cause poor communication.
        4. The system delivered does not meet the users’ needs
        5. The “maintenance” phase is often prolonged and traumatic.
        6. The project is usually grossly late and over budget.
      4. FAIL because:
        1. You won’t get the requirements right the first time
    2. Agile methods
      1. Ceremony is kept to an absolute minimum
      2. Respond in an “agile” way to changing requirements
      3. Iterative – consist of short cycles where part of the software is produced.
      4. Frequent interactions with the customer.
      5. Strong emphasis on testing
      6. Done in small “self-organising” teams with little specialisation or explicit leadership
      7. Best known are Extreme Programming (XP) and SCRUM.
      8. pros
        1. cope much better with requirements change than waterfall
        2. code quality should be good
      9. cons
        1. requires competence and confidence from all developers
        2. long-term planning may be difficult
        3. lack of documentation may cause problems later
        4. probably not suitable for large projects
      10. Extreme programming (XP)
        1. Requirements are determined from “user stories” (similar to informal use cases)
        2. Customer representative always available
        3. Unit tests are written before the code to be tested
        4. Programming in pairs – one codes the other reviews, swapping frequently.
        5. Strong emphasis on simplicity of design.
        6. “Refactor whenever and wherever possible”
      11. The Unified Process (UP)
        1. Iterative
          1. Timeboxed, typically 4-6 weeks.
          2. Each iteration produces a production version of a subset of the system
        2. Phases
          1. Inception.
          2. Elaboration.
          3. Construction.
          4. Transition.
        3. Disciplines
          1. Requirements gathering (WS 1 and 2)
          2. domain modelling (WS3)
          3. class design (WS4)
          4. coding, testing etc
        4. Artefacts
          1. UML diagrams
          2. reports
          3. code
          4. other deliverables
    3. Why software projects fail:
      1. Incomplete requirements.
      2. Lack of user involvement.
      3. Lack of resources.
      4. Unrealistic expectations.
      5. Lack of executive support.
      6. Changing requirements and specifications.
      7. Lack of planning.
      8. Elimination of the need for the project.
      9. Lack of IT management.
      10. Technology illiteracy.
  5. Design
    1. Is simple
    2. Is easy to test
    3. Is easy for other developers to understand
    4. Adapts well to changing requirements
    5. (Usually) one which is object-oriented.
    6. Provide guidance on assignment of responsibilities to classes – the core skill in OO design.
    7. Follows “Gang of Four” design patterns
    8. Patterns
      1. Can be reused.
      2. Contain an explanation of when and how they are applicable.
      3. Have names (e.g. Composite) used to communicate rapidly between designers.
      4. Also good for passing on wisdom to inexperienced designers.
      5. Follows GRASP
        1. High cohesion/low coupling
          1. High cohesion: a class represents a single well-defined entity.
          2. Low coupling: a class interacts with as few other classes as reasonably possible.
          3. Some coupling is essential, but avoid spaghetti-type links between classes.
        2. Polymorphism and inheritance (see the Java sketch after this list)
          1. Subtype polymorphism (inheritance).
          2. Many beginners misuse/overuse inheritance.
          3. There should be an “is-a-kind-of” relationship between superclass and subclasses.
          4. DON’T use inheritance to avoid code duplication if the is-a-kind-of test fails.
          5. Often, other classes only need to be coupled to the superclass.
          6. Operations, e.g. addReview(), can be implemented once in the superclass.
          7. More subclasses (e.g. CD) can be added without changing (or even recompiling) existing code.
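      A minimal Java sketch of the is-a-kind-of idea: addReview() and the CD subclass come from the notes, while CatalogueItem and Book are assumed names. addReview() is implemented once in the superclass, and a new subclass can be added without touching code that is coupled only to the superclass.

```java
import java.util.ArrayList;
import java.util.List;

// Book and CD really are kinds of catalogue item, so the is-a-kind-of test passes.
abstract class CatalogueItem {
    private final String title;
    private final List<String> reviews = new ArrayList<>();

    protected CatalogueItem(String title) { this.title = title; }

    // Implemented once here; every subclass inherits it unchanged.
    public void addReview(String review) { reviews.add(review); }

    public String getTitle() { return title; }
    public List<String> getReviews() { return reviews; }
}

class Book extends CatalogueItem {
    Book(String title) { super(title); }
}

// A new subclass can be added without changing (or recompiling) code that
// depends only on CatalogueItem.
class CD extends CatalogueItem {
    CD(String title) { super(title); }
}

public class InheritanceDemo {
    public static void main(String[] args) {
        List<CatalogueItem> items = List.of(new Book("UML Distilled"), new CD("Abbey Road"));
        for (CatalogueItem item : items) {
            item.addReview("Five stars");          // polymorphic call via the superclass
            System.out.println(item.getTitle() + ": " + item.getReviews());
        }
    }
}
```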
      6. M-V separation
        1. Always separate internal data structures (model) from the UI code (view)
        2. Allow each to change without affecting the other
        3. Could also be MVC, where a controller sits between M and V (see the sketch below)
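      A minimal sketch of model-view separation (all class names are illustrative, not from the notes). The model holds the data and notifies listeners; the view only renders, so either side can change without affecting the other.

```java
import java.util.ArrayList;
import java.util.List;

// The model knows nothing about how it is displayed; views register as listeners.
class ScoreModel {
    public interface Listener { void scoreChanged(int newScore); }

    private int score;
    private final List<Listener> listeners = new ArrayList<>();

    public void addListener(Listener l) { listeners.add(l); }

    public void addPoints(int points) {
        score += points;
        for (Listener l : listeners) l.scoreChanged(score);   // notify all views
    }

    public int getScore() { return score; }
}

// A view renders the model's state; swapping in a GUI view needs no model changes.
class ConsoleScoreView implements ScoreModel.Listener {
    @Override
    public void scoreChanged(int newScore) {
        System.out.println("Score is now: " + newScore);
    }
}

public class MvDemo {
    public static void main(String[] args) {
        ScoreModel model = new ScoreModel();
        model.addListener(new ConsoleScoreView());
        model.addPoints(10);   // prints "Score is now: 10"
    }
}
```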
  6. Testing
    1. Safety critical systems
      1. Run the system on two computers with different software written by different teams; in theory they will not both fail at the same time.
      2. Provides graceful degradation: you don’t have to revert to manual control immediately with one computer out.
      3. Gives continuous testing for free: each discrepancy reveals a bug, so the system should become extremely reliable.
      4. But if there is a discrepancy you don’t know which computer is wrong, so you need a third computer and take a majority vote: triple redundancy (see the sketch below).
      5. Process
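      A toy sketch of the majority-vote idea behind triple redundancy: three independently written channels produce a value and the voter accepts whatever at least two agree on. The code is purely illustrative and not from the notes.

```java
public class MajorityVoter {

    // Returns the majority value, or throws if all three channels disagree
    // (a real system would trigger a fail-safe action here).
    static double vote(double a, double b, double c) {
        if (a == b || a == c) return a;
        if (b == c) return b;
        throw new IllegalStateException("No two channels agree - fail safe");
    }

    public static void main(String[] args) {
        // Channel 2 has a bug; the discrepancy is detected and out-voted.
        System.out.println(vote(42.0, 41.7, 42.0));   // prints 42.0
    }
}
```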
    2. Basic testing techniques
      1. Traditional solution: have a separate testing team who are as nasty to the software as possible
      2. Agile solution – write the tests before the code – also helps to clarify requirements.
      3. Equivalence partitioning: since exhaustive testing is impossible, divide the input space into partitions, test a representative of each, and focus extra tests on the partition BOUNDARIES.
      4. Black box
        1. Means tests will be written without preconceptions about how the code works.
        2. If the code is changed, the same tests are still valid.
      5. White box
        1. Allows more tests to be done
        2. Allows tester to apply pressure to those places which look most likely to break.
    3. Kinds of testing
      1. Unit testing
        1. Relatively simple, but the class you’re testing will usually rely on other classes.
        2. The search space is generally well defined, so techniques like EP (equivalence partitioning) and BVA (boundary value analysis) are most useful here; see the JUnit sketch below.
        3. Often possible to be systematic and reasonably confident that a single class is bug-free.
        4. In Java, often done with the JUnit testing framework.
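      A small JUnit 5 sketch combining equivalence partitioning and boundary value analysis. The Mark class and its 0-100 valid range are hypothetical; the point is one test per partition plus tests at the partition edges.

```java
import static org.junit.jupiter.api.Assertions.*;
import org.junit.jupiter.api.Test;

// Hypothetical class under test: a valid exam mark lies in 0..100.
class Mark {
    static boolean isValid(int mark) {
        return mark >= 0 && mark <= 100;
    }
}

class MarkTest {
    // One representative from each equivalence class: below range, in range, above range.
    @Test void rejectsNegativeMarks()    { assertFalse(Mark.isValid(-5)); }
    @Test void acceptsTypicalMarks()     { assertTrue(Mark.isValid(57)); }
    @Test void rejectsMarksOverHundred() { assertFalse(Mark.isValid(250)); }

    // Boundary values: the edges of the partitions are where bugs cluster.
    @Test void boundariesOfValidRange() {
        assertFalse(Mark.isValid(-1));
        assertTrue(Mark.isValid(0));
        assertTrue(Mark.isValid(100));
        assertFalse(Mark.isValid(101));
    }
}
```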
      2. Integration testing
        1. Harder to define than unit testing; shape of testing space is less obvious.
        2. Check that the use cases can be performed without problems.
      3. Regression
      4. Smoke testing
      5. System testing
        1. The deployment environment will generally be a lot more varied than the context in which the system was developed.
        2. May involve different hardware, operating systems, performance issues etc.
        3. Need to check the documentation and procedures as well as the code.
      6. Alpha testing
      7. Beta testing
      8. Acceptance testing
        1. In general, users don’t really know in advance what they want (the “waterfall fallacy”).
        2. Who within the customer organisation defines the spec? e.g. Managers and end users will have different views.
        3. Fixating on passing the acceptance test could result in serious problems being missed.
    4. Bugs
      1. Embury’s Law
        1. Applies to all non-trivial (>10k lines) projects
        2. Some components of the software will be bug-free
      2. Kinds of bugs
        1. Causes crashes/freezes
        2. Fails to provide functionality the user needs
        3. Fails to conform to non-functional requirements
        4. Inappropriate UI design
        5. Documentation/training bugs
        6. Requirements bugs
      3. Help against bugs
        1. Testing
        2. Code review
        3. Refactoring
          1. If the design isn’t working well, refactor.
          2. Split classes, move methods, replace algorithms, etc. (see the sketch below)
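      A before/after sketch of a small refactoring (extract method): behaviour is unchanged, and all class and method names are invented for illustration.

```java
// Before: the VAT calculation is tangled into the summing loop.
class InvoiceBefore {
    double total(double[] prices) {
        double sum = 0;
        for (double p : prices) sum += p;
        return sum + sum * 0.2;
    }
}

// After: each step has a name and can be changed or tested on its own.
class InvoiceAfter {
    double total(double[] prices) {
        return addVat(subtotal(prices));
    }

    private double subtotal(double[] prices) {
        double sum = 0;
        for (double p : prices) sum += p;
        return sum;
    }

    private double addVat(double amount) {
        return amount * 1.20;   // 20% VAT, kept in one place
    }
}
```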
        4. Frequent contact with users and stakeholders
        5. Risk-driven development
        6. Experience
      4. Bug density factors
        1. On average, 3-5 bugs per 100 lines
        2. Concurrent, reactive systems are more difficult than sequential, transformational systems
        3. Programming language: Java < C < Perl
        4. Programmer competence and experience
  7. User Interface
    1. Usability Dimensions
    2. Basics
      1. Allowing users to achieve a goal with efficiency, effectiveness and satisfaction
      2. Utility is the functionality of a system
      3. You can have utility without usability, but not vice versa
      4. There are paradigms of good usability, e.g. the GUI
    3. When
      1. Often left until last, with not enough time
      2. Can design a UI as soon as the functionality is known
      3. Early prototyping; keeping it Agile
      4. Also helps force requirements questions
    4. WIMP (Windows, Icons, Menus, Pointers)
      1. The user interface widgets that you commonly use (see the Swing sketch after this list)
      2. There are many styles for deploying these widgets in a UI
      3. Some interfaces are not based on WIMP
      4. The user interface designer’s skill is choosing a design that affords the functionality
      5. A different skill from programming
      6. Some people would claim disjointness in the two skill sets
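      A minimal Swing sketch of the usual WIMP widgets (window, menu, button, text field). The toolkit choice and all names here are illustrative; the notes do not prescribe a particular library.

```java
import javax.swing.*;

public class WimpDemo {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JFrame window = new JFrame("WIMP demo");       // Window
            JMenuBar menuBar = new JMenuBar();             // Menu
            JMenu fileMenu = new JMenu("File");
            fileMenu.add(new JMenuItem("Open..."));
            menuBar.add(fileMenu);
            window.setJMenuBar(menuBar);

            JPanel panel = new JPanel();
            JTextField nameField = new JTextField(15);     // text-entry widget
            JButton searchButton = new JButton("Search");  // pointer-operated widget
            searchButton.addActionListener(e ->
                JOptionPane.showMessageDialog(window, "Searching for " + nameField.getText()));
            panel.add(nameField);
            panel.add(searchButton);
            window.add(panel);

            window.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            window.pack();
            window.setVisible(true);
        });
    }
}
```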
    5. Principles
      1. Visibility of system status: the system should always keep users informed
      2. Match between system and the real world: the system should speak the user's language
      3. User control and freedom: functions chosen by mistake need a clear 'emergency exit'
      4. Consistency and standards: avoid ambiguity
      5. Error prevention
      6. Recognition rather than recall
      7. Flexibility and efficiency of use
      8. Aesthetic and minimalist design
      9. Help users recognise, diagnose and recover from errors
      10. Help and documentation
    6. The Human
      1. Has mental models of goals and intentions
      2. The human receives information, forms intentions and executes actions with tools in the world
      3. The human “information processor” has limitations
      4. Brilliant at association and inference
      5. Limited memory: The famous 7 plus or minus two
      6. Physical limitations – finger span, tiredness
    7. Cycle of Execution and Evaluation
    8. The Computer
      1. Very fast
      2. Good memory
      3. Good at long, repetitive tasks (at which humans are poor)
      4. Rubbish at inference and association
    9. HCI
      1. Presentation
      2. Observation
      3. Articulation
      4. Performance
      5. Framework
        1. Ergonomics
        2. Dialogue Design
        3. Rendering State
    10. Styles
      1. Wizard Interface
        1. Defined start point
        2. Restricted set of known steps
        3. Set of known options
        4. Ability to move back and forth and change options
        5. One choice affects what is offered for another
        6. Typical of an installation process
      2. Form Fill-in
        1. Again, a set of known steps and known options
        2. No defined start point; the user might want to start anywhere
        3. Example: specifying a package holiday: start point, end point, dates, hotels, tours, etc.
        4. One choice affects another
      3. Direct Manipulation Interface
        1. Humans work by manipulating things both mentally and physically
        2. A user reaches for what he/she wants, uses it and puts it down again
        3. Humans also wish to see the results of their actions
        4. Dragging a file to the printer; putting files in the bin; making words bold; drawing lines etc.
        5. “hey you do that”
        6. Lots of functionality available all the time
      4. Command line interface
        1. Typically not for everyday users
        2. Very good for mass action
        3. “mv *.txt ../other-folder”
        4. Very good for constructing arbitrary, complex actions
        5. Often very fast for the power user
      5. Batch Processing
        1. A set of repeated steps
        2. Need to be run repeatedly in the same way
        3. “A batch” of commands
        4. Don’t necessarily need to interact as the process runs
        5. But do need feedback, such as logs
        6. Still an interaction style
    11. Ergonomics of a typical GUI
      1. Mini Modes
        1. Long-term modes
        2. Short-term “spring-loaded” modes
        3. Alert modes
        4. Modes that emulate a real-life situation that is itself modal
        5. Modes that change the attributes of something
        6. Modes that stop functionality, as in error conditions
        7. There should be a clear indicator of the current mode
        8. It should be easy to change modes
      2. Learnability
        1. Easy to learn; easy to remember
        2. return after long break and know how to use the tool
        3. return after a short break (cup of tea) and apprehend state
        4. Consistent application design
        5. Closeness to user’s model of the task helps mapping
        6. Most users don’t read manuals
        7. Users prefer to learn by playing or exploring
        8. Therefore good presentation of model to user is vital
        9. “Walk up and use”
        10. Transferable skills
      3. Recall vs Recognition
      4. Predictability
      5. Consistency
        1. External consistency: Consistency with task
        2. Ergonomic issues
        3. Internal consistency: Arbitrary consistency
      6. Reachability
        1. Does the UI map from all parts of task model to system model?
        2. Can the system do things that I cannot ask it to do?
      7. Translating Input
        1. Can the input reach the (and only the) necessary states of the system?
        2. Match task analysis or activity diagrams to use cases and class/collaboration diagrams
        3. Small cost to user, larger cost in implementation
      8. Ease of Evaluation
        1. Translate state from core to output language
        2. Must preserve state attributes in terms of domain concepts
        3. Output language often limited in expressivity
        4. Video simply limited in size – difficult to see context in documents etc.
    12. Heuristic Evaluation
      1. Simple and natural dialogue
      2. Speak the user's language
      3. Minimize user memory load
      4. Be consistent
      5. Provide feedback
      6. Provide clearly marked exits
      7. Provide short cuts
      8. Good error messages
      9. Prevent errors