Monday, February 24, 2014

EdTech 541: Relative Advantage of Presentation Software

Using digital presentation software (such as PowerPoint or Google Presentation) improves productivity and instruction.  It is a valuable way of increasing student engagement and memory in the classroom.  It is a means of sharing information that appeals to learners who prefer a mixture of visual and auditory input, and it is attractive to students because it speaks to their diverse learning styles through multimedia sound, images, color, action, and design. The messages from well-designed presentations are clear and concise, helping learners attend to the information in a focused way. Just as children’s book illustrations enrich a story and bring it to life, presentation software can help tell a “story” and break the learning experience into digestible chunks.  The brevity of the slides’ content aids comprehension and helps learners commit the material to long-term memory.  If used sparingly and appropriately, sounds and motions can also help sustain attention.


Presentation software is easy to use and easy to modify.  This means that students and teachers can create interactive and attractive designs by using professional-quality standard templates/themes or by designing their own.  It ensures that teachers and students who are less artistically inclined can still feel confident creating a visually appealing presentation.  The graphic design piece is already in place. The time and frustration of creating other paper-based visual aids, such as charts or posters, can be avoided.  Color-coordinated styles, fonts, and spell-check take away the tedium of some of the lower-order thinking tasks typically associated with presentations, allowing more time for acquiring and organizing knowledge. In addition, reordering slides, adding images, or altering text content can be done instantaneously and without much labor.


The art of presentation is much smoother with this software as well. If teachers or students are presenting their slideshow, it is easy to advance to the next slide and maintain the audience’s attention.  There is less of an opportunity for distractions than there might be with bending over to use overhead transparencies or facing a whiteboard.  A good presentation could also eliminate the need for distracting handouts.  It’s important to note, however, that the presentation itself must not become a distraction.  Good design principles, variety of content, pace, and limited animations/sounds are important to ensure maximum benefit. (For ideas on this see: www.garrreynolds.com, office.microsoft.com, howstuffworks.com, or www.makeuseof.com)


PowerPoint-type software can aid in collaboration by involving every student in the learning and teaching processes. Peer-teaching opportunities abound.  Investigative reporting is a great launching point for this software because the students take charge of their own learning, and there is an authentic “performance” piece at the end in the form of an interactive presentation (which can serve as a formative assessment).  Roblyer and Doering also suggest student-produced activities such as book reports, poetry writing, research or persuasive presentations, sharing “how-to” procedures, or simple concept reviews (p. 132-133).


In addition to being collaborative, presentation software can be quite interactive.  Students can include hyperlinks or embedded videos in their presentations to give examples or provide evidence.  They can link to or embed their peers’ slides in a group presentation as well.  Modern software is quite dynamic and allows multiple avenues for interconnectivity and hypermedia.


Presentation software is not always intended to stand alone (without narration), but is convenient to have on hand for review.  Students who are absent from class can still benefit from watching a PowerPoint presentation at home to get a handle on new information.  The fact that digital presentations are easily embedded into websites means that students can review lectures, demonstrations, project descriptions, tutorials, or portfolios at home by accessing their class website.


Teachers can also save time and energy by preserving their presentations (and building on them to improve a lesson), instead of writing on the whiteboard or giving the same lecture/instructions over and over again. Presentation software makes sharing information a very organized experience.  It provides structure to a lesson.  It also helps annotate or highlight important concepts and terms in a visually-appealing way.  For example, providing print-outs of PowerPoint slides alongside a presentation could support and promote note-taking skills (if that was the objective skill being rehearsed).  Despite being quite structured, presentations are easily modifiable, and therefore flexible to include additions and improvements at any time.


Students also benefit from this software's built-in scaffolding, which encourages them to think sequentially and in segments.  It helps them improve their organizational skills by requiring them to think through what they are going to say, do, and show in a systematic way. It is a constructive means to an end.

Please review my presentation software integration example by clicking here
(Note: In its current "presentation-mode" form, due to a Google bug perhaps, the hyperlinks are not clickable.  Therefore, please refer to the final slide for URLs of linked websites.)


References


Roblyer, M., & Doering, A. H. (2013). Integrating educational technology into teaching (6th ed.). Upper Saddle River, New Jersey: Pearson Education.

UCF Faculty Center for Teaching & Learning. (2014). Effective Use of PowerPoint. Retrieved from: http://www.fctl.ucf.edu/teachingandlearningresources/Technology/PowerPoint/index.php 

Sunday, February 23, 2014

EdTech 503: Discussion #2



1) Research indicated that the single most important information an instructional designer must consider is prior knowledge (Streamlined ID, p. 53). What are some strategies instructional designers can use to discover prior knowledge? How can educators access this information?

(February 19)
While observations and conversations with learners are convenient and popular ways to gauge students' prior knowledge, they can lead to a vague and subjective view of what students already know instead of a true objective analysis.  Surveys can be appropriate for a quick overview of general skills, but they must be well-written to cover different types and depths of knowledge for a specific course, as it is often difficult for a person to articulate an assessment of their own abilities. Concrete evidence of prior knowledge could come in the form of performance tasks or key-concept tests (which could easily be implemented through a multiple-choice quiz, holding up answer cards (yes/no or colors), or using response clickers).  Examples of these can be found here.  Another objective method that provides a picture of knowledge content as well as the organization of knowledge is the use of concept maps, such as those discussed here.  View this example and this one too. These include terms/concepts, the relationships between them, and a classification of those relationships (with linking words).  They should include examples as well as a hierarchy of general to specific terminology.  Although they may be complicated to explain at first, and a bit time-consuming for the learner to complete, they provide a good view of misconceptions and of the complexity of a learner's overall understanding of a topic.

I haven't used these types of intentional analytic pre-assessments in my classes before, but their value is clear.  Another appealing measurement tool is the use of Classroom Assessment Techniques (CATs).  They are quick, immediate, and useful for gauging levels of understanding from an entire group of learners (rather than individuals). Although they are often used during instruction, they can be adapted to be used beforehand as well.  Please visit this webpage for examples of CAT activities.  The "Minute Paper" sounds to me to be very similar to the "K" in a KWL chart, in which students have 1-2 minutes to write down everything they know about a topic.  The instructional designer can group/classify the responses into three levels of understanding to derive an overall impression of background knowledge.
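Since no specific tool is prescribed above, here is a minimal sketch in Python (with hypothetical student names, scores, and cutoffs) of how pre-assessment results, whether from a clicker quiz or a scored "Minute Paper," might be grouped into three levels of prior knowledge:

```python
# Minimal sketch: grouping pre-assessment scores into three levels of
# understanding. Names, scores, and cutoffs below are illustrative only.

def classify_prior_knowledge(scores, low_cutoff=40, high_cutoff=75):
    """Group students into three levels based on a 0-100 pre-assessment score."""
    groups = {"novice": [], "developing": [], "proficient": []}
    for student, score in scores.items():
        if score < low_cutoff:
            groups["novice"].append(student)
        elif score < high_cutoff:
            groups["developing"].append(student)
        else:
            groups["proficient"].append(student)
    return groups

# Example: quiz or scored "Minute Paper" results collected before instruction.
pre_assessment = {"Ana": 35, "Ben": 62, "Cai": 88, "Dee": 71}
for level, students in classify_prior_knowledge(pre_assessment).items():
    print(f"{level}: {students}")
```

The cutoffs and labels are not from any particular instrument; in practice they would come from the objectives and rubric the designer has already defined.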

I believe conversations and observations with learners about their prior knowledge need to be supplemented with concrete evidence that can be documented.  It's important to take time to construct carefully-worded pre-assessments that shed light on the skills and concepts, types of knowledge, and connections among knowledge for that target topic. Oftentimes these pre-assessments look similar to regular formative assessments.

* * * * * * * * *
(February 23)
Caroline, I really appreciate that you brought up the idea of individualized instruction embedded in the ID process. This has been something I've been concerned about, as the ID process, at least initially, came across to me as very formulaic and prescriptive.  It seemed like designing an input-output machine.  In the analysis stage, we are gathering a lot of raw data and making many generalizations and conclusions.  But, like you, I feel it's important to remember the individuality of the learner.

Personalized learning involves acknowledging that students will have different sets of prior knowledge before delving into a task, and that the learning experience may have to be varied for them to be successful in achieving the desired objective (even though their product may look different). Assessing prior knowledge is not a one-off task.  It's part of the continual revision and additive process of ID.

Caroline, I really liked your idea about incorporating pre-assessment into the actual lesson plan of an instructionally designed class/activity. This pre-assessment might be an abbreviated version of what the instructional designer went through in initially creating the activity.  Or it could be a sort of informal "streaming" activity to group students by knowledge/ability level (with regard to the specific task objective).
I feel that designing different launching points for instruction based on the results of the pre-assessment activity ensures the plan is inclusive and appropriate for all.  The launching point could be different for different groups of individuals.  It could either help bring students "up to speed" enough to complete the task, or help them expand upon the upcoming task's requirements so that the experience is more enriching and "new".  (An extension would be appropriate if the bulk of the lesson was teaching students how to make presentations with Google Docs and several already know how.)

Another way of using different jumping-off points embedded in instructional design is a bit more student-directed. Howard Gardner introduced the idea of "entry points" as a way for students to approach a new topic according to one of their interests or intelligences.  In his scheme, students could enter an activity or topic by choosing: the Narrative or “narrational” Window, the Logical/Quantitative or “numerical” Window, the Foundational or “existential” Window, the Experiential or “hands-on” Window, or the Aesthetic Window.  (I found an interesting overview with examples here, perhaps from another EdTech course?!)  Implementing an activity like this could shed light on students' prior knowledge and experience while making them motivated and confident in completing the upcoming task.  This type of entry point, or launching point, would clearly work better in some disciplines than in others, but the idea that students choose where/how to enter is another option.  The rest of the instruction could be more streamlined and universal, while remaining conscious of valuing and incorporating students' preferences and intelligences.

Achieving instructional objectives is the target of an ID process, and we know that there are many, many ways to do that. So an approach that plans for student diversity and includes performance flexibility is important.  It's not enough to just do it "behind the scenes" or in a general sense.


2) a) In case study 7, critique the steps Maya took to identify the needs in the case. What are some things she did well? What would you suggest to improve her practice?
b) In general, is a needs analysis always necessary when planning an instructional design project? Why or why not?

(February 20)
a) Maya was a very empathetic, patient, and reflective thinker when it came to initiating an instructional redesign of Ruth Ann’s 7th grade math course.  I would have loved to have such a helpful colleague at my side at many points in my career! I believe she was a good sounding board and “needs-detective”; however, at the same time, I can understand how Ruth Ann would have left the exchange feeling a bit unsatisfied and uncertain as to the next steps.
Maya was overall quite thorough in her needs analysis process in that she systematically collected data from various sources and stakeholders. She:
  1. Identified teacher assumptions and beliefs (which included behavior distractions, motivational struggles, and deficiencies in basic skills).
  2. Noted defining characteristics of students (overall demographics and home life).
  3. Recorded field notes from interviews with students (from two different ability levels and a lower grade level), the teacher (Ruth Ann), and the assistant principal regarding their:
    1. Attitudes and values about the subject and its relevance
    2. Opinions about materials (books, money, technology use) and methods (collaborative work, board work, problem solving)
    3. Thoughts on assessment measures (homework as well as state- and school-dictated)
  4. Noted which methods had been tried before and had proven unsuccessful, according to the teacher.
  5. Sought points of alignment between the materials (textbook) and state standards.
I believe Maya could have done a better job of collecting more objective data through surveys and other numerical/statistical analytic tools (rating scales, frequency-count forms, checklists, etc.).  Measurable data would’ve shown patterns, trends, and priorities more clearly than conversational interviews. It would help make a more compelling case for the focus of the solution (e.g., which problem was the most pressing or universal?).  Maya could’ve also helped paint a clearer picture at this stage by including a detailed list of the current resources available as well as any constraints that would limit the new plan’s design, budget, and/or schedule.
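As an illustration of the kind of measurable data being suggested here, this is a minimal sketch in Python (with hypothetical survey items and ratings) of how a simple rating-scale survey could be tallied into frequency counts and averages, making patterns easier to spot than interview notes alone:

```python
# Minimal sketch: tallying hypothetical rating-scale responses (1 = strongly
# disagree ... 5 = strongly agree) into frequency counts and averages.
from collections import Counter

survey_responses = {
    "Math feels relevant to my life": [2, 1, 3, 2, 2, 4, 1],
    "I am confident with basic math facts": [3, 2, 2, 3, 1, 2, 2],
    "I enjoy solving open-ended problems": [4, 3, 5, 2, 4, 3, 4],
}

for item, ratings in survey_responses.items():
    counts = Counter(ratings)                 # frequency count per rating
    avg = sum(ratings) / len(ratings)         # overall tendency per item
    print(f"{item}: average {avg:.1f}, counts {dict(sorted(counts.items()))}")
```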

Maya also set out well in her analysis of the data. First of all, she noted that the results showed a discrepancy between the textbook and the standards.  The analysis also pointed out that Ruth Ann, her “client”, did not really have a clear vision of what her class should be like.  In her mind, what did the students actually need to know, do, and believe?  The state standards weren’t going to teach themselves.  Yes, Ruth Ann wanted an innovative “real-world” math environment, but was scared to leave the sense of security and control that came from drill-and-practice techniques. So what was her priority? Introducing open-ended and complex math problems, including more writing in the problem-solving process, improving overall motivation, incorporating more technology, or bolstering basic math skills? This lack of a focused definition of the problem made defining the performance gap quite difficult, or at best, imprecise.  Therefore the goals of the new teaching plan were not anchored in anything concrete and clear-cut.

I thought it important to note that throughout the analysis process, Maya’s own expectations were also challenged. Just like any teacher, ID’ers have their own assumptions which must be deconstructed in the analysis phase.  In this case there was a cultural context at work in the background of these students’ lives that she had not considered: resistance to education in general.  This observation prompted her to emphasize real-world connections in her plan suggestions going forward.

I believe Maya was a bit too quick to suggest a plan to Ruth Ann, and the plan she rather casually introduced lacked definition.  Integrating a variety of new curricular materials sounded like a promising fix, as it would hopefully address the issues of relevance and motivation.  However, Ruth Ann still had concerns, which included alignment to standards, a lack of targeted focus on improving basic math skills, an overemphasis on writing in the math curriculum (which the students did not find motivating), a lack of teacher training and comfort with the materials and pedagogy, assessment, and parent/community response.  A priority mapping technique may have assisted in connecting these concerns with solutions that were embedded in the plan.  It would have weighted some concerns more heavily than others, which may have helped Ruth Ann feel less overwhelmed.  After all, no single solution was going to be able to address all the problems she articulated.

Maya’s plan needed a bit more evidence to support its implementation.  It needed to demonstrate how it was, in fact, a solution.  Ruth Ann was clearly having difficulty seeing how the new curricular materials were going to address the gaps in students’ skill-levels.  She was still trusting of the recently-adopted textbook and appeared fearful of diverging from it too readily.  While the discussion was healthy and helpful in that it “broke down” the problem, Maya could have perhaps done a better job of analyzing and prioritizing the pieces before suggesting a new plan in a relatively haphazard way.

b) I, too, believe a Needs Analysis is best practice for a thorough instructional design project.  It’s helpful because it forces the designer and client to come to some agreement about what is currently taking place and what should be taking place.  Only then can the needs be determined.  Coming up with a “what should be” was a difficult phase in the process for Maya and Ruth Ann, and I’m not convinced that they actually successfully bridged the gap.

* * * * * * * * *
(February 23)
What a helpful guided analysis process you shared with us, Mike. Thank you! Like you and many others here, I believe that in this (all-too-familiar) story of Maya and Ruth Ann, the needs analysis comprised too many narrative or subjective components. Raw data was missing at the front end, and well-defined, objective evaluation was lacking for the future. There were so many variables at play contributing to the problem (or, in my opinion, problems) that it was difficult to come up with a feasible plan.

I thought you brought up a great suggestion for Maya to seek and define “specific performance improvements (test scores, student reported feelings, etc)”. I feel that Maya offered a well-meaning and perhaps excellent solution to Ruth Ann (try new materials and a new way of teaching). However, it was such a tall order and was left rather open-ended, which meant its implementation was likely to be poor. Not only that, but it seemed Ruth Ann had little faith in the prospect of that devised plan. I believe Maya shouldn’t have encouraged her to bite off more than she could chew. The plan needed to be broken down into pieces, each aligning with a specific performance or behavior. They should have devised a plan of attack that included “baby steps” to help transition from one teaching style, classroom structure, array of materials, and set of expectations to the new one. What Maya was suggesting was a pretty drastic shift in pedagogy. Instead, perhaps she could have selected and clearly defined one performance indicator to tackle each term.

Also, as you rightfully said, Mike, Maya’s understanding of the learning problem was holistic. She was able to ingest a wide variety of components from the atmosphere of the learning context. For example, she seemed to really “get” the students and their attitudinal/motivational problems. However, in my opinion, that’s where the success of this ID initiative sort of trailed off. Maya needed to carefully tease out the specific problems she wanted Ruth Ann to address. I agree with you, Mike, as you suggested “can the problem be defined in tangible, quantitative terms?” Despite all the analysis, I feel strongly that there was little definition of the actual problem. Or rather, there were too many distinct problems lumped together that a one-size-fits-all solution could simply not address. As I said in my initial reply, there was no clear-cut performance gap on which the instructional design solution could anchor itself. Another factor was the lack of quantitative data to support the definition of the problem.

I must say that my approach to this problem would be very similar to Maya’s. Despite my critiquing of various aspects of her game plan, in reality I would probably have gone through a similar thought-process. I would try to gain a general, overall understanding of my students and their issues, and then try something drastic or wildly creative to transform their learning and get them to move beyond their “issues”. I am not inherently systematic in my approaches! However, by reflecting on our readings and various case studies, I am gaining a better appreciation of the scientific and problem-solving approach to instructional design. Focus is key. Data is important. Cause-and-effect needs to be at the heart of decision-making.

Sunday, February 16, 2014

EdTech 541: Instructional Software in the Classroom

Instructional software is a hallmark of 21st century education, and is one way in which classrooms can become “cutting edge”.  Well-designed software can bring concepts to life, motivate learners, offer individual support, and encourage collaborative problem-solving.  Instructional software refers to any type of computer program that is specifically designed to deliver instruction or assist with a topic’s instructional delivery (Roblyer & Doering, 2013, p. 77), and includes drill-and-practice, tutorial, simulation, instructional games, and problem-solving functions. Many types of instructional software are in line with both constructivist and directed teaching styles, although drill-and-practice and tutorial programs are most closely associated with the latter.  Overall, these programs help students explore topics and encourage them to either generate their own knowledge or rehearse facts and skills.  Instructional software also assists the teacher through its ability to monitor student performance, provide feedback, and include built-in aids allowing for individualized instruction. It can save time, money, and resources by replacing field trips, teaching materials, worksheets, and even supplemental teaching staff.

Other benefits include:
- Test preparation
- Memorizing/automatizing low-level skills
- Rehearsing problem-solving procedures
- Supplementing homework
- Increasing student involvement
- Providing constant and immediate feedback
- Personalizing learning experiences
- Supporting efficient learning of complex tasks
- Concentrating focus on essential components
- Exploring variables, playing with time, and repeating experiments
- Creating entertaining and positive learning experiences
- Making real-world connections

Please view my PREZI presentation
Instructional Software in the Classroom

Teachers should ensure that the software in use in their classrooms is of high quality, is in line with objectives, and has a clear relative advantage over traditional teaching materials.  Making a habit of evaluating software is important to ensure our students are learning from programs that are easy to use and serve an efficient and effective purpose.  Before introducing a simulation, game, or tutorial in my classroom, I make sure I’ve worked through the program through the eyes of a student.  I don’t necessarily complete a formal review, but I do keep in mind issues of design, ability levels, directions, assessment, ease of use, and technical quality.  These websites suggest additional factors to consider when reviewing educational software for classroom use, and may help to create a more formal and habitual evaluation process and policy:

Evaluation Factors:  Within the Plan for Technology Implementation from Montague Area Public Schools, we can find a valuable set of suggested qualities that could be transformed into a checklist for software evaluation.  Teachers could collaborate to critique particular software in question, and come to consensus about a best product depending on the factors which include content, appropriateness, questioning techniques, motivation, creativity, learner control, objectives, feedback, modifiability, record-keeping, documentation/support, technical quality, start-up, hardware and peripherals.

How to Evaluate Software and Websites: Author Gregg B. Jackson offers tips for personal software evaluation as well as links to reliable evaluation guides and portals.  The criteria include content, ease of use, pedagogical strategies, standards-alignment, and impact evaluation. The resource list includes Children’s Software Review and the International Society for Technology in Education, an extensive searchable database.

In a similar way, Warren Buckleitner discusses software examination practices in his 1999 article The State Of Children's Software Evaluation. Here we not only revisit some important questions to ask ourselves while reviewing software, but we also consider different categories of reviewing bodies.  For example, he describes the differences in the effectiveness of reviews from school districts, the software/hardware industry, software catalogs, commercial magazines, trade group award programs, universities, independent experts, and non-profit organizations.  He compares their bias, depth, and the currency of their contributions.  The article briefly discusses the history of software evaluation, and concludes with a list of contemporary software reviewing bodies (as of 1999).

It is helpful for teachers to keep on top of new software developments or options by searching through independent company or organization reviews such as those found at EducationWorld or SuperKids.com.  Although these may be limited in scope, they are typically peer-tested and peer-reviewed.  They are a good place to start brainstorming when unit planning, to get a feel for “what’s out there”.
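To make evaluation factors like those in the Montague plan more concrete, here is a minimal sketch in Python (with hypothetical criteria, weights, and ratings that are not drawn from any of the guides above) of how a team-built checklist could be scored so candidate programs can be compared on a common scale:

```python
# Minimal sketch: a weighted evaluation checklist for comparing software.
# Criteria, weights, program names, and ratings are all hypothetical.

criteria_weights = {            # relative importance agreed on by the team
    "content accuracy": 3,
    "age appropriateness": 3,
    "feedback quality": 2,
    "ease of use": 2,
    "technical quality": 1,
}

def weighted_score(ratings, weights=criteria_weights):
    """Ratings are 1-5 per criterion; returns the weighted total."""
    return sum(weights[c] * ratings.get(c, 0) for c in weights)

candidates = {
    "Program A": {"content accuracy": 4, "age appropriateness": 5,
                  "feedback quality": 3, "ease of use": 4, "technical quality": 3},
    "Program B": {"content accuracy": 5, "age appropriateness": 3,
                  "feedback quality": 4, "ease of use": 3, "technical quality": 4},
}

# Rank candidates by weighted score, highest first.
for name, ratings in sorted(candidates.items(),
                            key=lambda kv: weighted_score(kv[1]), reverse=True):
    print(f"{name}: {weighted_score(ratings)}")
```

The weights simply encode which factors the team agreed matter most; adjusting them is how the consensus discussion described above would show up in the final ranking.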


References

Buckleitner, W. (1999). The State Of Children's Software Evaluation-- Yesterday, Today and in the 21st Century. Retrieved from http://www.childrenssoftware.com/evaluation.html

Jackson, G. B. (2000, May/June). How to Evaluate Educational Software and Websites. TechKnowLogia, 57-58. Retrieved from http://arlington.k12.va.us/cms/lib2/VA01000586/Centricity/Domain/3914/evaluate%20software.pdf

Montague Area Public Schools. (n.d.) Appendix E. - Plan for Technology Implementation. Retrieved from http://www-personal.umich.edu/~sdbest/techplan/maps/App_e.htm

Roblyer, M. D., & Doering, A. H. (2013). Integrating educational technology into teaching. Boston: Pearson/Allyn and Bacon Publishers.


Wednesday, February 12, 2014

EdTech 503: Discussion #1

1) According to the iterative ADDIE Model and the continual-improvement streamlining principle, Instructional Designers should evaluate each phase of the design model by asking for input from subject experts and the target audience. As an educator, does this seem feasible in your teaching environment? Why or why not?

(February 12) 
Although it would be ideal for an educator to seek input from SME’s and students at every stage of the instructional design process, I do not believe it is a realistic expectation. Like Caroline mentioned, it often comes down to time.
From my understanding, an instructional designer is distinct from a teacher. The process they go through to create learning tasks is inevitably different from a classroom teacher’s approach because they have time and resources specifically devoted to the design process. Their time and energy are not consumed by additional routine, administrative, and management responsibilities as an educator’s would be. Working relatively “behind the scenes,” an instructional designer has more opportunities to engage in conversations with SME’s and the target audience throughout the process (and more specifically, at the completion of each phase in the model). A teacher is expecting and trusting that an instructional designer has already gone through this process when she selects a prescribed lesson/activity to implement. In the absence of an instructional designer with whom to collaborate or from whom to receive teaching materials, I feel it would be difficult for an educator to consult SME’s throughout each part of a design process. That is, unless she a) somehow has a lot of extra time, b) is extremely unfamiliar with the subject matter she’s teaching, or c) is a veteran teacher whose main priority is to significantly revamp a lesson/module/course she’s been teaching for a long time. If anything, I think it would be most reasonable that the average teacher would consult an SME during the Design stage only. And in many cases, especially at the elementary level, Internet resources may take on the role of the SME.
Before this course I had never heard of instructional design (as a theory-based field of study). I have never worked with an instructional design team, and have never had the luxury (as I imagine it would be!) to have one with whom to collaborate in my school. Like many teachers, I believe I’ve been wearing the ID hat (amongst others, like the chap in Caps for Sale) throughout my ten very eclectic years of K-12 teaching. It has been my responsibility all along to take curricular objectives and sort through resources to create learning experiences, keeping in mind theories of educational psychology and “best practices” I’ve adopted along the way. Collaboration opportunities are not readily available, as I’ve always been the only teacher in my subject matter or age group. The design process is not formal or prescriptive, and rapid prototyping has often been the norm, but many elements of ADDIE are in there somewhere! For example, after designing and implementing a learning activity, I rely on my own and my students’ subjective reaction to and objective academic success with the formative and summative assessments I’ve administered. Debriefing isn’t constant, though, as I feel it would be distracting to continually “step outside the box”. Nevertheless my audience’s voice is valued, and I think it would be reasonable that an educator would seek feedback from her students during the Analysis, Implementation, and Evaluation phases only. Obviously, however, she would keep their background knowledge, ability levels, learning styles, and attitudes/interests in consideration throughout all stages of instructional design.
It would be wonderful for teachers to adhere to a systematic and iterative process of analysis, discussion, reflection, and revision at every stage of the design process, but at the end of the day, a large portion of an educator’s job (and time) involves motivating and managing the learners in her charge.

 * * * * * * * * * * * *
(February 16)
Nora and Matt, thanks for encouraging us to think a bit more about the nature of evaluation.  Matt, it helped me come to terms with the "science vs. art" debate to see how you contrasted the "act of evaluation" with the "creation of evaluation". Nora, I valued your proposal that evaluation tasks are a creative endeavor.  I like to see myself falling more on that side of the spectrum.  However, I've been reflecting a lot lately on analytic and scientific evaluation as well, and would like to be more knowledgeable and active in my use of those tools.  (This is an area in which I know I need to improve.)

As instructional designers, how are we meant to interpret our students' creative responses? Typically rubrics for open-ended/project-based assessments are valuable tools, but I feel they are hard to derive "scientific" results from. Although a well-designed rubric aligns directly to objectives and desired results, there is inherently going to be a bit more subjectivity and "creative interpretation" on the part of the teacher.  Doesn't this throw off the "control" of our ID experiment? Do the results from a rubric-based evaluation of an Instructionally Designed task have enough merit on their own? Is an activity sound by ADDIE "standards" if we base our evaluation of the task on how well we think the students creatively responded to it?  I suppose this depends on the objectives and the nature of the lesson.  Nonetheless, I would feel uncomfortable basing my Instructional Design analysis on a lesson that was open-ended, "creative", and constructivist in style. Are there enough predictable and measurable parts in it?

At this point, I see the value of ID for directed instruction (which is quite scientific by nature), but wonder if there is a place for it in alignment with other teaching philosophies.


2) As a novice instructional designer, which aspects of developing instruction do you consider to be inherently artistic? Which aspects of developing instruction do you consider inherently scientific? (Brown & Green, 2011, p.18) 
(February 12)
Just as the right-brain/left-brain theory of learning has been debunked, so too have the lines blurred between art and science. Both domains are inquisitive by nature: they seek to answer “What is true? Why does it matter? How can we move society forward?” (Maeda, 2013*) Leonardo da Vinci helped illustrate that they don’t have to exist dichotomously. Investigation and imagination can both lead to meaningful discoveries. As others before have mentioned, people can solve problems with artistic and scientific influences (some potentially weighing more heavily than others). Luckily, instructional designers can be systematic as well as creative in all phases of their design process.
Analyze: Mostly scientific: Defining specific problems, needs, and goals; Listing resources; Identifying constraints and priorities; Studying content; Considering prerequisites. Artistic (to a small degree): Coming up with exploratory questions to determine the problem; Describing (“painting a picture of”) learners and their contexts. For the most part this phase is like gathering background research and defining/framing the issue, which is mostly identified with scientific thinking.
Design: Mostly artistic: Creating activities and assessments in alignment with outcomes; Embedding diverse teaching strategies in information/skills transmission; Developing the media and/or specific delivery system for instruction; Collaborating with SME’s. Somewhat scientific: Planning the sequence of activities according to logic, time, resources, and other external factors; Weighing the pros/cons of different activities, media, and assessments; Planning the evaluation to be controlled and strictly correlated to objectives.
Develop: Mostly artistic: Creatively adapting or blending existing materials; Designing prototypes; Making the actual instruction materials (using technology, graphics, etc.); Project management tools (that are subjective or include “soft skills”). Somewhat scientific: Project management tools (objective); Following blueprints designed in the Design stage; Delineating implementation guidelines.
Implement: Mostly scientific: Developing precise procedures to deliver instruction according to the prescribed plan; Ensuring materials and conditions are in place (that the “experiment” is controlled); Collecting feedback; Tracking evaluation. Somewhat artistic: Human interaction.
Evaluate: Mostly scientific: Analyzing feedback and reviews; Engaging in formative and summative evaluation; Examining teamwork and the success of the ID process; Drawing conclusions. Somewhat artistic: Making revisions (creating alterations without deviating too drastically from the intended design); Observations and evaluations can be reflective or subjective (to some degree).
In general, the “creation” and “development” elements of the process are more open-ended and artistic than the “gathering evidence” and “analysis” components. Nonetheless, the presence of art and science in the design process ensures a problem-solving approach that is balanced (i.e. not too prescriptive/predictable, and not too detached/aesthetic).
*Although unrelated to this discussion’s topic, John Maeda’s article Artists and Scientists: More Alike Than Different brings up interesting connections between the two fields (art & science) in the realm of education.
* * * * * * * * * * * *
(February 16)
Caroline, I completely agree and sympathize with you and your "teacher hat" -- I have one too, and I think it's gonna just have to stay stuck on!  I think breaking down our lesson and unit planning through the Instructional Design process, even if we only adhere to its most general components and sequence, can only make us better instructors.  I don't think we have to "lose" anything in the process.  I hope not, at least, as I think looking at designing an instructional task through the eyes of a teacher is beneficial for making an effective and practical task.  We already know how to take it from "teacher's guide" (ID) mode to execution.  Working through the ADDIE sequence is a great metacognitive exercise in reflecting on best practice...and perhaps the idea is that these focused analytic techniques will become habit.  At any rate, yes, proper Instructional Design is ideal, but it is not feasible that a full-time teacher will be able to engage in both activities to an acceptable degree. Reflecting on the design steps (and trying to incorporate them as much as possible), engaging in target audience feedback, and inviting discussions with SME's is a commendable effort, I think.

Now with regard to the issue of SME's, this is also a new term for me since delving into this course.  As a (mostly) elementary teacher, I'm wondering what subject matter experts look like for this particular age group?  Are these the same scientists and mathematicians that are consulted for high school and collegiate-level courses? Are SME's really necessary for the younger age groups? And if so, is their purpose quite different?  (Perhaps a math SME is an expert more in HOW students best learn certain math skills rather than being expert "fraction-workers" or "arithmaticians".  But if that is the case, aren't they overlapping with the instructional designer's role?)  Also, teaching six or seven subject matters, with units of study changing so rapidly, I don't think it's feasible that an elementary teacher is going to be consulting SME's every step of the way.  Thanks for letting me know what you think about this (or anyone else, too, for that matter!) I'm still struggling with the role/purpose of SME's in the younger grades.  Thanks!

Thursday, February 6, 2014

Acceptable Use Policies (AUP's)




image source: Jack Zalium (Flickr)
Permitting students to roam the internet freely is like setting them loose on the Las Vegas Strip at night with a credit card, a map, a camera, a megaphone, and a mask of anonymity.  It could be a recipe for disaster, and provoke legal issues that a school would not want to be a part of.  In addition, this lack of guidelines and boundaries could place a child in danger. For these reasons, it is critical that an educational institution instruct students in how to use technology properly, supervise their use of networked equipment, and establish and publish an Acceptable Use Policy.
An AUP is a policy that a user must agree to follow in order to be provided with access to a network or technology peripherals.  It serves to remind students that technology exists for an educational purpose, that they are expected to use it respectfully and responsibly (as ambassadors of the school), that they are being monitored, and that consequences exist for infractions. It is important that students understand that an IT teacher is available to address any concerns, questions, or issues that arise pertaining to the policies set forth. If written well, an AUP could be used as a teaching tool. Creating a practical and effective AUP requires research and planning, customization, and school stakeholders’ involvement.
The National Education Association suggests that an effective AUP include six key elements: a preamble, definition section, policy statement, acceptable uses section, unacceptable uses section, and a violations/sanctions section (Education World).  Typically both students and parents sign the document acknowledging restrictions to the network access and releasing the school of responsibility for infractions.
The preamble is an introduction to the purpose and goals of the policy. The definition section defines key words used throughout the document for parent and student comprehension. In my opinion the AUP document should be written in such a way that the definitions of ambiguous words are included in line with any reference to them. Placing them in bulk at the beginning takes away from the flow and central message of the policy. The policy statement lists the services covered by the AUP as well as the means by which students can come to use those services at school. I believe that this section should be brief and general in scope, as it would be inefficient and perhaps impossible to list all technology services offered at the school. The acceptable and unacceptable uses sections are meant to list appropriate and inappropriate behaviors surrounding the use of the machines.  Again, it is advisable to keep these general or at least categorized so as not to overwhelm and discourage the reader.  The violations/sanctions section serves to tell students how to report violations of the policy and what the consequences would be for students’ violations of the agreement.  I would suggest adding a conclusion to the document that lists some ways in which the school tries to keep students safe (providing supervised work periods, search filters and firewalls, virus protection, prevention of downloads of executable programs, etc.)  It should also be clear about the limitations of liability (e.g. the school will not be responsible for damage or harm to persons, files, data, or hardware), as there needs to be a clear separation between students’ behavior on the network and any unnecessary and unfair legal or financial ramifications.
In my research of AUPs, this recommended structure was difficult to find in practice. One exemplar crowdsourced school social media policy from edudemic.com does not include all of those elements, but is very effective in its language and structure.  This document starts off by recognizing and offering examples of how social networking is fun and valuable, but alludes to risks and uncertainties that may spoil its use while in school.  The specific guidelines that follow are grouped under various umbrellas of civic duty and character, similar to rules/expectations you might see in any classroom (e.g., responsibility, respect, etc.).  The language of the guidelines is direct but descriptive.  It is not condescending or threatening, nor is it sugar-coated.  The matter-of-fact details give any student clarity as to proper behaviors on the internet, even going so far as to suggest options for if and when he/she makes a mistake.  It encourages everyone to be part of protecting and representing the school without force or fear.
Netiquette, plagiarism, personal safety, and cyberbullying are all worth mentioning in an Acceptable Use Policy because they directly correspond to many of the harmful or prohibited actions mentioned within the policy.  They give a name and purpose to the behaviors that the AUP is aiming to curtail.  The concrete examples of acceptable and unacceptable use listed at the end of the model AUP also correlate to those four domains and reinforce what proper technology use actually looks like.  This keeps the tone of the document supportive, positive, and constructive.
It is impossible for an AUP to cover all technologies that exist or will exist, which is something that the other example social media acceptable use policy from edudemic.com does a good job of incorporating in its “bylaws”.  Therefore, the rules put forth should be broad enough to include social media, email, hardware, school network storage, and even personally-owned devices.  Obviously some popular or standard technologies will have specific guidelines that need mentioning (such as sharing passwords to access the network, exchanging files through email, or downloading programs). However, for the most part, expectations for all device and social media use are interconnected. In a Techlearning.com article, principal Eric Sheninger points out that in his school’s documentation there are no long, drawn out policies for BYOT, cell phone use, or social media. There is one short document for parents and students to sign that shows their acceptance of responsibility for using all devices properly in all learning environments.
The AUP should act like a guide for behavior, and not a legally binding, convoluted list of fine print.  In that regard, it should use language that all constituents can understand. The language should stem from a perspective of trust and common sense.  The Gwinnett County Acceptable Use of Electronic Media for Students is an example of an AUP that is quite direct and forceful in tone, but loses its audience due to its threatening use of ALL CAPS and reminders of potential state and federal violations.  It would be difficult to use as a learning tool within any elementary classroom context. In fact, it would fill any fourth grader with fear!  On the other hand, the AUP from Pittsylvania County Schools includes a verbose and adult version labeled “School Board Policy” as well as a breakdown of graphically-enhanced and publicly accessible AUP “highlights” for elementary (as well as middle/high school) students.  This compromise obviously kept all stakeholders within their comfort zone of “coverage” and ensured students had a grasp of the guidelines buried in the legalese of the official document.
In the same Techlearning.com article discussed previously, we learn the importance of including parents, students, teachers, and school leaders in the discussion to create such policies.  There really is no need to bring in policymakers and lawyers when an AUP is meant to represent the voice of the users.  Each school should be allowed flexibility and customization of language and content according to the students in attendance. For example, in the North Arlington Elementary Schools’ AUP, there is a special section devoted to technology programs of special importance to the school. Teleconferencing and Study Island (an internet program) each feature an explanation and an opt-in/opt-out permission section to ensure understanding and comfort with the programs.
Another example of personalization can be found in the Montgomery School Responsible Use policy, which puts forth different expectations for each sub-set of grade levels according to the media and technology likely to be used.  Each document (PreK-2, 3-5, 6-8) is written in developmentally appropriate language and assumes a certain maturity level of the target audience.  This makes the document something that any child can feel comfortable reading, discussing, and obeying.  What makes this particular document effective is that it begins with a very positive preface stating that “Montgomery School believes that all students should have access to technology when they act in a responsible, efficient, courteous and legal manner. Internet access and other technologies available to students and teachers, offer a multitude of global resources. Our goal in providing these services is to enhance the educational development of our students.” Instead of beginning with negative vocabulary about prohibited behaviors or consequences, the introduction assumes the best from its students.  The rest of the agreement is made up of a series of “I promise” statements that help the document read more like a friendly contract.
The Acceptable Use Policies for schools located where I currently work in Bermuda are not part of the public domain.  In fact, to find the policy for my current school I had to dig through both the student and staff handbook.  Sure enough, a list of privileges, restrictions, and consequences “covered the school’s back” and sought to keep students out of trouble, but they are not adequately discussed at school or posted as a reminder in networked areas.  The AUP, as it exists, is simply stuck in paper form in a thick booklet between “uniform code” and the school calendar, blurred from excessive photocopying.  Perhaps it’s time for a revamp.  At the very least, it’s time for a refresher with the students…before they head back on the internet “Las Vegas-style”!
image source: en.wikipedia.org
Education World.  (n.d.). Getting Started on the Internet: Developing an Acceptable Use Policy (AUP). Retrieved from: http://www.educationworld.com/a_curr/curr093.shtml
Gwinnett County. (n.d.) Acceptable Use of Electronic Media for Students (Version 050107). Retrieved from: http://www.gwinnett.k12.ga.us/BerkeleyLakeES/PDF/AUP_web.PDF.
Edudemic. (n.d.). School social media policy. Retrieved from: http://www.edudemic.com/school-social-media-policy/
Nielsen, L. (2012). Looking to create a social media or BYOD policy? Look no further. Retrieved from: http://www.techlearning.com/Default.aspx?tabid=67&EntryId=4355
North Arlington Elementary Schools. (2008). Computer/Internet Acceptable Use Policy. Retrieved from: http://www.narlington.k12.nj.us/Files/AUP/Elem_AUP_08-09--compact.pdf.
Pittsylvania County Schools. (n.d.) Computer Technology Acceptable Use Policy. Retrieved from: http://www.pcs.k12.va.us/aup/index.html