Saturday, February 28, 2015

EdTech 523: Multimedia Development and Resource Collection



Discussion Prompt:

Task 1: Please look at Figure 2-4 from Role of Interaction in Online Environments in Toward a Theory of Online Learning.

Write about the multimedia development in your project and how it fits within this diagram, or tell us about your resource collection and how it will support your learners. Provide examples. Post your initial response by Saturday, February 28. Please provide a comment to at least one of your peers by Tuesday, March 3.

Task 2: Please help us create a professional resource that contains places to get visual and audio representations that help in the teaching of concepts and processes. Submit at least one entry.


My contribution: In my program (Tech Integration Basics for Teachers), I am aiming to create two areas of resources -- one teacher driven and one student driven. The teacher-provided resources will be organized by category (I am using the six ISTE standards as my categories) as well as by type: website, blog, academic journal, graphic, or video (at least those are the types I've come up with so far). They will help explain the six ISTE standards from various perspectives and with authentic, real-world context. They will provide more background/description-style information. By organizing them into media/format categories, I am encouraging students with different learning styles to approach the material in a way that appeals to them and makes sense. Breaking it up also makes it appear less daunting. The student-driven resources are activities they have found or tried that highlight the six ISTE standards. These will serve more like an idea board or recipe book for activity implementation. By making this a communal document, it's my hope that the activity will not only encourage collaboration but will also serve as a useful hub of "tried and true" tech integration activities. It will also help the teacher-students feel more confident with technology because they will have understood, considered, and hopefully adapted/implemented each activity before sharing. Presenting both sets of resources as matrices will keep them organized, approachable, and digestible.

Reply to Peer: Nathan, this is a fantastic idea and one that I would love to propose to the staff in my district down the road. The use of Classroom in general has spread like wildfire quite organically, more than any other tech initiative that's been introduced. I think it's helpful in that it's pretty intuitive to use and makes the workflow so efficient and organized. Are the teachers quite comfortable with the screencasting task? Are they provided time to create these videos and participate in this course during the school day, or is it an extra add-on initiative? I certainly have always valued reciprocal teaching as a great reinforcement of skills with my students, and I think these videos are an example of that. The teachers' confidence will soar and it will undoubtedly promote community-building among the staff. I love this kind of communal resource-building!

Sunday, February 22, 2015

EdTech 505: Week 6 - EPD

Summary of chapter 4 (Starting Point: Evaluator's Program Description) in Boulmetis & Dutwin course text

Please view this PowToon I created.

Discussion posts: 
#1) Evaluation report update

My evaluation report is going to focus on a 4th grade 1:1 Chromebook initiative in our school district. I have decided to narrow down the goals for this program so that there is more coherence and focus when it comes to identifying evaluation questions and, subsequently, survey questions for the data collection process.  Therefore I will intentionally limit the scope of my evaluation.  I understand now that surveys of important stakeholders will be the most important, if not the only, data source for my program evaluation.  Therefore I plan to create three different surveys for the principal participants in the program: 4th grade students, 4th grade teachers, and the librarian/media specialist. These won't be pre- and post- data so much as formative reflection.  I hope that is acceptable.  I also wonder about gathering objective, quantitative measures, and look forward to seeing how I can incorporate those into my survey design.

It was helpful to work through the GAP analysis exercise last week, which helped me establish a more clearly defined context for this program. This was necessary since I dove into the program when it was already in the beginning of the implementation stage (which is when I began working in the district and first learned about this initiative). It's unfortunate that, as evaluator, I was not part of the Needs Assessment process, but discussions I've had with the teachers, district superintendent, and our internal grant committee have shed light on the process by which this program came to be. They all identified the goals, and the teachers themselves discussed some of the program planning that took place.  I am concerned, however, that there simply was not much program planning outside of providing Chromebooks for staff and students.  Not only that, but there was no mention of evaluation at all in the early stages of program planning, which certainly surprised me. Our EPD-like exercise, although not as formal as those in our textbook, was fruitful and necessary to bring me up to speed and help us figure out the gaps that exist in the program's definition. Until the program activities are clearly defined, it's fruitless to align objectives to evaluation measures.

Luckily there have been no major changes or unexpected problems identified.  Everyone is still on board, and the district superintendent and the internal grant committee are eager to see the results.  The results of this will surely be helpful for future pilot programs, and will also be helpful as we extend the Chromebook program into its second year.

Here are some of my lingering questions/concerns:
1. The “operation” of the program, in many ways, is the teachers generally carrying out all their lessons.  Perhaps this program is unique in that it isn’t really a stand-alone event.  It’s a full-scale integration.  This part may be challenging because it’s hard to determine which procedures/materials are unique to the “program”, and which are just part of the general teaching practice.
2. Should I use the entire school year as the scope, as Sept. is the beginning point of the program?
3. Can I really get enough data from surveys alone? I'd wanted to do anecdotes and observations, but completely understand why they are not advisable (or allowed, for this assignment).
4. How many surveys are necessary/appropriate? Is it okay not to do a pre- and post- test?
5. My report is formative, so I don’t expect any of the objectives to be “met” right off the bat, but for this evaluation process to help improve the program so that they ARE more effectively/efficiently met.  (Is this okay?)  That being said, it will be difficult to evaluate impact, which seems to be more of a summative assessment feature.

#2) AEA - There have been questions about evaluators - Who are they? What are their qualifications? What’s the training? Is there certification? Etc. Go to the American Evaluation Association Web site - http://www.eval.org/ - and explore the site. “The American Evaluation Association is an international professional association of evaluators devoted to the application and exploration of program evaluation, personnel evaluation, technology, and many other forms of evaluation.” What did you find of interest? What's pertinent to EDTECH 505? Anything surprising? General impressions? Etc.

I was impressed with the social responsibility that the AEA takes in its mission, goals, and values. The statements of "ethically defensible, culturally responsive evaluation", "under-represented groups", "global and international evaluation community", "inclusiveness and diversity", and "responsive, transparent, and socially responsible association operations" all point to the proactive measures this association takes to engage in community-minded evaluation methods. It was interesting to note that, although it is an American association, over 60 foreign countries are represented in the membership. Members come from various professions and with various interests as well. In our increasingly globalized society, the more perspectives the better!  I was motivated by the End Goal, "Evaluators have the skills and knowledge to be effective, culturally competent, contextually sensitive, and ethical professionals," and the subgoal, "Evaluators use a multicultural lens to engage diverse communities in evaluation effectively and with respect, to promote cultural, geographic, and economic inclusiveness, social justice, and equality."  I'm curious how this looks, though!  These are ambitious goals, and ones that I would certainly be motivated to be a part of.  How does an evaluator or evaluation process begin to tackle these social causes?  At the end of the day, the evaluation must be true to the program objectives, right?... I would be interested in learning more about how AEA evaluators have balanced these aims.
As others have mentioned, the Guiding Principles are helpful indicators of what makes a good evaluator. These are important characteristics for us all to keep in mind as we delve into our own projects.  Although we may not solve the world's problems, we can be systematic, competent, honest, respectful, and responsible for the public welfare as we carry out our task.  The "Learn" section, with its Coffee Breaks webinars and virtual conference, looks like a great hub for gathering more information from the field.
I was interested in the results of the search I conducted for evaluators in my state. I was surprised by the number of results! Education, non-profit organizations, and social work seem to be the main audience for these firms. (In fact, I never had thought of evaluators as belonging to a firm before, but I suppose it makes sense...)  Nevertheless, there aren't too many job openings in Wisconsin right now -- only one as a Program Evaluation professor at UW-Madison.  I wonder if the supply outweighs the demand, or, in general, how often AEA evaluators are called upon.  It's refreshing to look beyond my school windows at these things!

EdTech 523: Collaboration

Discussion Prompt:

In Chapter 8, Promoting Collaborative Learning, Palloff & Pratt state, “Collaborative effort helps learners achieve a deeper level of knowledge generation while moving from independence to interdependence, thus strengthening the foundation of the online learning community” (p.157). Collaboration can strengthen the online learning community by allowing us to learn from one another. This will be more of a sharing discussion to learn about some of the tools others have used in implementing collaborative activities.  Make sure to post your tool to the Online and Blended Teachers Diigo group.

Collaboration in online environments can be challenging. Share some ideas for online collaboration and include at least one tool that might facilitate the process. In your post, discuss potential pitfalls in implementing collaborative activities using Web-based tools.


My contribution: Our school has been fortunate to have received some grant-sponsored telepresence technology (Cisco) that allows for high quality videoconferencing for increased teacher and student collaboration. It's been a slow start getting teachers on board, as the "high-tech" look of the machines and their interface seems more complicated than it really is. It's also been challenging finding schools/classes with whom to collaborate. However, the few virtual meetings we've held with these units this year have been very successful. Similar to Skype or Google Hangouts, the telepresence system provides opportunities for students to speak with scientists, authors, and other experts. It also has been a huge hit with students when it comes to cross-cultural connections. In December, a 2nd grade class "met" several times with a 2nd grade English immersion class in Mexico to discuss and learn worldwide holiday traditions. They shared notes, drawings, songs, etc. with each other, exchanging cultural notes and authentic, first-hand experiences. In another teleconference, our 1st grade students took a virtual field trip to the North Pole to complete math activities with other students around the country, led by Mrs. Claus. Extensively expanding the four walls of the classroom has been nothing but motivating, interesting, and rewarding. Teachers as well have benefited from professional development opportunities by which they have collaborated with other districts in our state who wish to learn about our Readers and Writers Workshop model. They meet regularly via the telepresence units to share successes and ask questions.

One issue we are looking to address right now is compatibility and the organization of resources/connections. There really is no central database of which schools have which devices and programs, so it can be challenging to know if the technology is compatible. Along the same lines, it's difficult to know which schools or classes are open to videoconferencing, and if so, what topics or projects they wish to address. Teachers are encouraged to use their PLN connections to build these kinds of relationships and collaboration opportunities. There are a few websites to get started, but it's pretty hard to find alignment. Our school is actually collaborating with a few other schools to build a statewide database to facilitate the connection process. Students are more than happy to jump on the bandwagon, although so far their role has been to participate as a whole class. I'm hopeful that we can work towards students independently collaborating with outside teachers, experts, or students (even if it's just through Skype or Google Hangouts). I think that this category of tools is extremely powerful for real-time collaboration that can cross borders. The applications of this technology are numerous! I can see how it would be invaluable for a Project-Based Learning environment or to help serve as a "primary source" in a history class, for example. It's also useful as a professional learning tool for teachers who are looking to build a community of learners to improve their craft.

Reply to a Peer: In addition to Skype in the Classroom, I would suggest you check out CAPSpace and CILC for formal, scheduled collaboration opportunities. Some teachers have just found connections through their own network of families and friends, and a few have gone through more of these kinds of organized channels. Does anyone know of any other places/websites that host, sponsor, or schedule collaboration opportunities such as virtual field trips, guest speakers, classrooms looking to connect, etc.?

Sunday, February 15, 2015

EdTech 505: Week 5 - Gap Analysis

Gap Analysis

Assignment for Chapter 3 in B&D text:  Apply Gap Analysis to the program/project you detailed in exercises in previous assignments. Explain Figure 3.1 Program Cycle (p. 51) in terms of the program that you have chosen to evaluate. Locate an Internet site that has something to do with “program cycle,” preferably something to do with education or your particular area/business. Use your discretion as to just how connected it is to “program cycle.” 





Discussion Board Assignment: Discuss the graphic above (from p. 51) and/or Table 3.1 - Program Planning Cycle and Evaluation Activities (pp. 62-3). Pros/cons. Something you disagree with? Where does "monitoring" occur? Anything missing? Bring in any prior experience you might have with such a planning cycle. 

The Program Planning cycle suggested in this diagram is an ideal model of evaluation that is a bit challenging to replicate in the real world (at least in the education world)! The foresight and hindsight views that are required for each part of the process are so, so important, but (in my experience) so hard to coordinate and implement! Unless there is a specialist team-member focusing on best-practice evaluation, it's a challenge to get already worked-to-the-max staff members to organize and execute a strategic, multi-step, integrated evaluation process of a new initiative. Perhaps larger school districts are more familiar with this process and address program evaluation in a more formulaic way, but in my teaching experience (mostly in smaller schools), most decision-making is left to objective narrative, democracy of opinion, and cost-analysis.  That is not to say that I don't see the value in this cycle! I just have never seen it in action from a program perspective.

A big "pro" of this cycle is that it reflects an iterative process.  Our work is never done, and even after the seemingly "final" summative evaluation, there is room for reflection and reconsideration of program alignment to an organization's philosophy and goals (which can also be fluid). Changes can be made before the next implementation. The "wheel" of the planning cycle doesn't stay static -- it's in forward motion as it turns.  Another "pro" is the presence of evaluation procedures and measures in the stages of program planning, implementation and formative evaluation, and summative evaluation. It's integrated throughout, making it integral to every step (except, perhaps, in the "Needs Assessment" when we're concerned with defining goals).

From my understanding, monitoring takes place only in the implementation and formative evaluation stage because it has a bit narrower of scope and is less concerned with objective-alignment and impact.

I believe this model will fit with my proposed evaluation project (1:1 Chromebooks) except for the fact that I came on board just after the program planning phase, and I know no evaluation plan was involved in that stage. I also do not believe a formal Needs Assessment was ever carried out.  Hopefully this will not be too big of an obstacle!  Also, the "summative evaluation" piece of the program cycle would most likely not come until after several years of running and formatively evaluating the program.  So while I can refer back to program planning and "look forward" to summative evaluation, most of my action as evaluator right now will need to take place in the implementation and formative evaluation stage.

Discussion #2: Monitoring vs. formative evaluation
My response to a peer:
Hi Matt, after reading the chapter I, too, was a bit confused by the discrepancy between monitoring and formative evaluation. Trying to distinguish the two as completely different processes seemed a bit pedantic at first. I re-read the section a few times, and what I concluded is that monitoring is an AID to evaluation, but can often go off on its own tangents and serve its own purposes.  "Although some monitoring functions may be useful to an evaluator, much of the data they collect may be useful only to the monitors" (Boulmetis & Dutwin, 2011, p. 57). Monitoring a program may involve more detailed minutiae than is needed for formative evaluation (e.g. looking at demographics of users, wording of documents, users' discussions/testimonials, functions, etc.).  Some of this data could be useful to an evaluator, but some may be just for record keeping or compliance, or to see if the program is even progressing according to its logistical plan.  In this way, monitoring tasks have a more short-term, narrow focus. Unlike formative evaluation, monitoring doesn't seem concerned with impact or the alignment of objectives to outcomes.  I am not sure if this is correct at all, and it's definitely still a bit hazy, but that's where these ideas are sticking in my brain right now!

Saturday, February 14, 2015

EdTech 523: Best Practices for Teacher and Student

Discussion Prompt:

In chapter six of Building Online Learning Communities, Palloff and Pratt discuss effective teaching and learning in the online classroom. To make teaching and learning successful online, all instructors need to examine best practices for instruction and the role the students play in responding to the changing mode of instruction.
  1. As more educators transition into the new form of teaching online classes, how is the instructors’ role changing from the traditional brick-and-mortar classroom to online instruction, and what do you see as some of the challenges and barriers online instructors may encounter?
  2. Not only is the instructors’ role changing but the students’ role must change also. As the function of the educator begins to change in the online learning process what new roles must students play in response to the changing mode of instruction?
My contribution: Teachers transitioning to the online environment should continue to employ what Collins and Berge (1996) recommend as good "social function" (i.e. community interaction and teamwork), as well as stable and well-thought-out managerial and pedagogical functions (or what Garrison, Anderson, and Archer refer to as "teaching presence"). Being an education facilitator requires many roles that are shared with any teacher (even in a traditional brick-and-mortar classroom) looking to organize and promote a learner-centered environment. Helping students become self-directed, reflective, cooperative co-creators of knowledge in the online world is a more involved activity because there is a stronger need for social presence, organization of materials and activities, and technical skills. An online teacher's role requires her to be more creative in her use of teaching strategies and tools to build an inquiry-based, critical thinking "playground" of ideas. An online teacher can't rely on vibrant, impassioned live discussions, engaging song-and-dance routines or "wow" lesson openers, or hands-on group tasks with markers and scissors. She must find alternate ways to captivate her learners, get them talking to one another, and guide them in their creation of knowledge. Strategy, purpose, and "above and beyond" presence are a big part of an online teacher's role.

Student roles in an online environment are much more self-directed. Students must be very active in their cognitive process. A good online class will require them to be strong critical thinkers in terms of viewing questions from various perspectives and questioning assumptions (their own and others'). Instead of being passive recipients of knowledge, they must collaborate with their peers to offer ideas, suggest resources, challenge assertions, evaluate products, and ask questions. There are typically fewer guidelines and articulated expectations than in a face-to-face class. This "openness" can be frustrating and overwhelming to many. An instructor must be prepared to assist students who have anxieties or confusion about the learner-centeredness of the course. Students must be organized, open-minded, self-motivated, and flexible when working in an online learning environment. They need to be driven to learn, knowing they will have to do a lot of the "hard work" themselves.

Palloff, R., & Pratt, K. (2007). Building online learning communities: Effective strategies for the virtual classroom (2nd ed.). San Francisco, CA: Jossey-Bass.

Reply to Peer: Karen, that's interesting to hear about your experience in a student-centered environment that the participants were thrust into unknowingly! I believe it's so important that a teacher explain the reasoning and rationale behind the move to structure and run the class in this way, or else students will be confused. Being used to certain roles and having developed a schema for what it means to be a "student" (based on their more traditional experiences), it's no wonder they were frustrated by the upheaval and confusion. Perhaps they thought their instructor was being lazy, perhaps they didn't trust their classmates to have and share knowledge, perhaps they didn't like the less-structured format because it required more self-autonomy and leadership... People take classes for different reasons, and having a frank conversation upfront about the "philosophy of learning" surrounding the course can make sure students either feel they are on the same page as the instructor or have enough preliminary context to seek out an alternative learning experience. Making learning goals explicit from the outset might have eliminated some of that backlash.

Kevin, your questions are perfect and definitely necessary as points of reflection. How DO we get students to buy into the community learning process as well as independent (student-centered) learning? That's something I was thinking hard about during Corey's and my lead discussion two weeks ago. It's easy to speak of the value and merit of adopting that philosophy as teachers, but how do we get our students to do the same? I think intrinsic motivation that comes from contributing to an impassioned discussion or a valued group task is one way to foster the buy-in. Encouraging students to reflect on the community learning process and then, as teacher, commenting on their observations/comments hopefully leads them to become thoughtful, eager participants as well. Building teamwork upfront with "high ropes course" types of learning tasks will get the students to see how integral they are to the community and how rewarding it is to be part of a healthy, productive group. The design of the course should hopefully ease them into the "learner-centeredness" of the course, such that soon they won't even realize they are grabbing the bull by the horns and paving the way for their own learning (instead of simply digesting information and coasting along). Giving students options, posing open-ended questions, demanding that students do the research... those kinds of teacher behaviors should help scaffold to independence. I'd certainly be interested to hear others' ideas as well, as this is a constant area of struggle!

Tuesday, February 10, 2015

EdTech 523: Community Building in the Online Course

Discussion Prompt:

Chapter three of Palloff and Pratt in this week's assigned reading talks about the many different aspects of the human side of online learning that can have an impact on building an online community. This human side of online learning can create challenges and obstacles for the participants of an online course. Chapter 3 identifies the following issues that can arise when building community in an online course.
  • The need for human contact 
  • Connectedness and coalescence 
  • Shared responsibility, rules and norms 
  • Roles and participation 
  • Shadow issues 
  • Ritual as the psychological expression of community 
  • Spiritual issues 
  • Culture and language 
  • Vulnerability, ethics, and privacy 

For this week's discussion, answer the following prompts.
  1. Select two of these identified issues and discuss the challenges that can occur. 
  2. How can you as the facilitator of an online course help participants overcome these challenges?


My contribution: Shared responsibilities, rules, and norms help build an online community that is committed to working together with the guidance of jointly established processes and policies. It's important for a community to avoid making too many "rules", which could lead to frustration, power struggles, or rebellion. Instead, boundaries, expectations, and norms can be discussed to the point of compromise and then explicitly stated so that everyone is on the same page. A leader can guide participants to agree on some standard norms such as openness, honesty, and safety. It may require a consensus-building exercise to determine the schedule or levels of discussion. As facilitators of a course, we could plan for our initial meeting to involve this kind of decision-making and "verbal" contract-signing. That way everyone knows, and has somewhat of a say in, the who, what, where, why, and when. This will make the community more productive and purposeful. It will also make everything run with "clarity" and efficiency.

Roles in an online classroom may not be as clear as they would be in a live setting because people do not really "know" each other as well. By establishing good presence, an instructor can make her roles more clear (facilitator, evaluator, motivator, etc.); however, it is important that she go out of her way to let students know that her roles are farther-reaching than in a traditional, teacher-centered classroom. Encouraging teamwork and collaboration is a good way to ensure that everyone feels they have a role to play in the online community. These roles should emerge naturally, and should (hopefully) serve the greater good of the community. Giving students voice and choice helps natural talents and personality traits shine through. Relinquishing control over discussions, some assignment tasks, etc. is another way of increasing participation naturally.

Palloff, R., & Pratt, K. (2007). Building online learning communities: Effective strategies for the virtual classroom (2nd ed.). San Francisco, CA: Jossey-Bass.

Reply to a peer: Marta, I'm so glad you shared that article in your post and brought this interesting angle to the discussion. It's one thing for all of us, as a group of post-graduate professionals, to participate in partner/group work and discussion boards. On the whole we have a sensitivity to others' feelings, busy schedules, concepts of fairness, spirits of collaboration or at least compromise, etc. This is a skill we've developed throughout our entire school, work, and family lives. This is a whole new ballgame for many K-12 students, however, especially if they are not as familiar with the online environment. Knowing how to speak up, keep teammates on track, consider all sides and opinions, etc. while working on group projects or participating in discussions is a skill that needs plenty of opportunity for rehearsal. No one should be "defaulted" into the "secretary effect". (That article struck so many chords! I shared it with many colleagues, thanks again!)

I agree with you that shifting roles throughout the course will help students develop these different collaborative skills. How to do that is a little tricky, especially since we don't want our students to participate in ways inauthentic to their learning style. It's a balance between encouraging students to follow their interests, personalities, and innate motivations and convincing them to step outside themselves and experience a different "role" in the learning process, even if it doesn't feel comfortable right away.

Perhaps having designated "duties" for projects or discussions that rotate throughout the course would help ensure that students get to try on new hats here and there (for example, idea "riffer", summarizer, researcher, scheduler, etc.). The instructor should also schedule opportunities for students to share honest reflection on their feelings about their role and place in the online community (this could be privately with the instructor, or could be a heavily moderated whole-group discussion). It's when students shut down and treat their roles as "givens" or, worse, as chores, that group learning becomes disengaging.

Sunday, February 8, 2015

EdTech 505: Week 4

Creating an outline for my Final Evaluation Report + additional activities.



Pertinent discussion post I contributed from this week:
Here is my eval project outline (with lots of questions!) so far.
My project is to evaluate our district's year-long fourth grade Chromebook 1:1 pilot program (with a view to using this pilot as a means of testing the practicality and value of distributing Chromebooks to all students). Some of the goals of this program are to improve rigor (in terms of collaboration, creativity, and critical thinking skills), support and enhance student responsibility and independence in their learning, offer differentiated curriculum via personalized remediation and enrichment, improve keyboarding skills required by online standardized testing, and advance student learning along the SAMR model.  Another sub-goal of the Chromebook initiative is to improve teaching pedagogy with a focus on increased rigor and teacher collaboration (in learning the device, apps, and new teaching practices).
The fourth grade teachers themselves drove this pilot program by submitting an internal grant to a district-wide teacher-led committee. Some of the student outcomes they cited were collaborative writing, blogging, multimedia and other creation tasks, using targeted apps to practice spelling, vocabulary, and math skills, and learning self-assessment techniques (particularly with reading). Teacher outcomes included working together to develop innovations in course content presentation and class structure.
I aim to conduct the evaluation by including all 75 fourth grade students and three teachers in one or two surveys.  As this is a year-long program (and has been in action for several months), I'm trying to figure out how to collect "pre-test" kinds of information... or is it best to draw a line in the sand and call this the pre-test, and at the end of the evaluation do another survey which will serve as the post-test?  I don't think this will be very authentic, and I'm hoping to figure out a solid plan for handling the surveys.
As the evaluator, I also plan to conduct a few "snapshot" visits of classrooms where I take stock of what kinds of activities I am observing to draw out any patterns or trends.  I would like to use student-created work as a data source, but I'm not sure how to make that an "objective" collection. I don't just want work that shows how the teachers could be using the devices for certain skills; I want to know whether or not they actually are.  The keyboarding skills are the only behaviors that I feel I can evaluate with an outside assessment (some kind of typing test), although even that is a bit artificial because we are concerned with seeing how students are integrating keyboarding skills in their habitual and natural use of the Chromebooks (impact vs. effectiveness).
I understand we're meant to shy away from interviews, but I am hoping to distribute a structured questionnaire to the media specialist/librarian as well as the principal to see if they are able to offer their data (thoughts? opinions? observations?) on students' independence as well as teacher collaboration with the devices.  As you can see, I'm still struggling with the data sources and the best ways to collect data while keeping the process as objective as possible.  The stakeholders involved in these tasks are definitely on board with this evaluation plan, and are happy to accommodate.  I just need to reiterate to the teachers, especially, that this is not a judgmental/critical task!
Completing the Outline of the Evaluation Report was a welcome exercise, and helped me wrap my brain around this task.  I also completed assignment option B, which forced me to organize my "knowns" into their appropriate category, and to think critically about aligning evaluation data sources/tasks to evaluation questions.  I know this will require a lot of revision, but it was a helpful starting point.  I'm counting on our textbook and the contents of this course to offer more guidance in establishing good evaluation questions, selecting the best data collection methods, and properly (objectively and thoroughly) analyzing results.  It seems like a daunting task to me right now, as this is such a big project to scrutinize. 

Tuesday, February 3, 2015

EdTech 523: Leading a Discussion

This week a classmate and I led our peers in an online discussion. We met synchronously using Google Hangouts to plan out our question prompt and the facilitation/schedule of our replies.  There were 71 discussion entries, many with layered responses! Overall, it was a success. Here is what we came up with:

A Technology Integration Specialist is proposing the adoption of an LMS in her school district.  Two important tenets of its purpose are to build community and to personalize learning.  For the first part of this discussion you will be assigned a role and will write a response to this proposal from that person’s perspective, as you imagine it. (Please click here to see what your role will be!)  In your second post you will step back out of your role and consider all the viewpoints presented.  Choose one or two stakeholders and address their questions and concerns from your own experience, insight, and resources.  Note: Please submit your first reply by January 31st, and the second by February 3rd. Use this week's discussion forum to reflect on connections between personalized learning and community building.

Here are some of my replies to classmates as I encouraged them to elaborate or take another viewpoint:

1. Karen-in-disguise, thanks for sharing The Current Ecosystem of Learning Management Systems in Higher Education article.  It was an interesting read, as I'm always interested in studying analyses/evaluations of real-life implementations (though this one was focused on higher-ed LMS integration).  It's refreshing to see how the "theoretical" side of a technology initiative pans out in practice.
When you say you can attest firsthand to an LMS's value, I am wondering how, as a teacher and student, you have learned about an LMS's functionality and capabilities. The above report indicated that a majority of instructors are only using them for sharing content versus for developing "interaction or engagement activities" (p. 10). What do you think is the reasoning behind this? What kind of training is involved on the teachers' and students' part to maximize its potential? Do you think it takes a certain kind of personality/philosophy for an educator to fully embrace an LMS, or can the tools provided by an LMS help change an educator's teaching philosophy and approach? (i.e., "If you build it, will they come?")
2. Although this testimonial appears on a particular LMS's website as a positive promotional piece, it made me think about your situation.  Here is an example of a school that adopted an LMS even though they were already doing extremely well without one.  What prompted Minnetonka to jump on the bandwagon? Does anything about their decision resonate with you?
3. This just came to me, but I was wondering if anyone has ever been part of an institution that used an LMS as a tool to teach the LMS (as a training alternative)?  Did that experiment work out, or did it just add mayhem and confusion?  (I was thinking about how, with our students, we try to teach with technology instead of about the technology... Could this work with adults?)
4. All fair points... as an additional point of reflection, do you feel that the development of community and the personalization of learning go hand in hand in any way?  If not, why?, and if so, how might that look differently in person versus online? (Does the online option appear any more efficient or effective?)
5. (playing devil's advocate...) As the president of my senior class with a 4.0 GPA, busy extra-curricular schedule, and a stack of scholarship applications to work through, I'm not sure I see the merit in an LMS for students like me, who are already achieving success in the current model of education. Yes, I understand that an LMS is an efficient way to receive course materials and stay organized with timelines, grades, and immediate feedback. However, why should I suddenly have to concern myself with "building community"?  Throughout my educational career, I've been striving to be #1. That's what it's all about. It's a competitive environment in my school, so I do what I need to do to get good grades -- I do what the teacher asks of me, complete my work on time, and study hard for tests.  Why would I need to "extend classroom conversations" with my classmates?  I prefer to learn the facts, do my job, follow my own interests, and worry about my own achievement. (After all, isn't that what personalized learning is all about?)   I'm not trying to be mean, but I just don't care that much about what my classmates have to say about stuff. Most of the time they don't care about school like I do anyway. I don't want to waste my energy in chat rooms with them, as I prefer to spend my time learning what my teacher and my textbooks say.
6. Thanks for your ideas. I look forward to more rewarding and enriching conversations with my peers in college versus high school, but for now I can appreciate what you have to say about getting comfortable with the system and the format.  I will give it a try, especially since you point out that it will give me a leg up in my future classes.  I am still wondering, though, besides allowing me to exchange ideas with classmates, how else can this digital model of education improve my personal learning?  How exactly can my teacher use this tool to help me get to my own next level?
7. I'm glad as a student you've had such a positive experience with an LMS helping your education feel more personal and meaningful.  Can you give us an example of how the LMS offered you more choice?  How does it help you delve into stuff you're interested in while still meeting course objectives?  I'm also curious to know if you enjoyed the community-building, social aspect of the platform, or if you found that a distraction? Thanks for being a great advocate! Glad it worked out for you, man!
8. Rebecca, I really like your suggestion of recognizing and building confidence in the "technology brave" teachers who are willing to try something new.  I think this is such a critical piece of a tech initiative's success.  We encourage our students to take risks, and try to foster a classroom culture of exploration and skill incubation-- the same should be said for the teaching environment. How might an administrator model this kind of safe environment?
Also, what happens if the goals and mission of the school do not easily invoke a "need" for technologies such as LMS?  This is something happening a bit in my school right now, where teachers feel strapped for time and are prioritizing other administration "mandates" (expectations?).  There is definitely a perceived disconnect between district goals (and assessment measures) and a general desire to improve technology integration.  I know this is kind of a huge question, but how can an administrator help teachers blend these targets besides just patiently hoping they will jump on the train?
9. Great idea, Nathan, about every teacher having a personal goal for technology development throughout a school year -- no matter how big or small.  Perhaps it could be tied to their Educator Effectiveness reports and professional evaluation measures if accountability and prioritization are concerns. Have any of you witnessed this kind of "open-ended" but mandatory technology goal asked of your teaching staff?  Was it well received?
I wonder if some teachers need even more direction and structure.  For example, "this term everyone focus on a student creation task, next on a transformative collaboration task..." or, "this year we are ALL focusing on the LMS (to whatever degree you feel comfortable), next year on BYOD teaching strategies, the next on technology-driven assessment tools", etc.  Since technology integration is such a big can of worms, it could be that some teachers feel more comfortable if the expectations placed on them are more focused and directed in theme (although the tasks/tools themselves can be varied...)
In a nutshell, how should we frame our technology expectations?

Monday, February 2, 2015

EdTech 505: Evaluation Design

Evaluation Design

Review Exhibit 2.1 (Evaluation Design: Company-Based Wellness Center) on pp. 41-42. 
Discuss it in terms of chapters 1-2. How does it pertain to chapters 1-2? Anything strike you as odd? Anything missing? Any questions? If you were the evaluator, is there anything that you'd do/state differently? What/who would you be sure to include?

Although it's a little challenging to understand the scope of the wellness center program without all the details, one of the first things that struck me about the evaluation summary was the program objectives. Main program goals of reduced depression and improved on-the-job productivity seemed a little unusual for a (presumably "extracurricular") company-based wellness center. I couldn't help but wonder why those were chosen (above others) to be big goals of the program, and I wondered if they had been articulated to the staff as well.  (And, if so, what they thought of them.)  That being said, the evaluation questions fit the stated program goals well, and at the end of the day the role of a good evaluator is to frame guiding questions based on stated (vs. perceived) program objectives.

Another potential issue is the timeline that is implied in the program goals. Twelve months may not be enough to conclusively determine program "impact"; at the same time, it may be too long to give credence to shorter-term outcomes (unless those kinds of formative benchmarks are built into the evaluation process, which is difficult to tell from this chart). Equally, one might deem this program effective because "95% of participants will have improved cardiovascular endurance", but could that be a case of "focusing on trivia"? (i.e., "the processes are effective but have little or no impact within the work environment" (p. 36)).

A limitation that might skew results of this evaluation is wide disclosure, which leads to scrutiny and criticism. As Chapter 2 explained, people do not care to reveal their shortcomings, and this might lead them to behave differently during the evaluation or to provide inaccurate responses. This is especially significant due to the sensitive nature of the activities and data sources in the wellness evaluation. Reflective journals, psychologist reports, physical examinations, and personnel evaluations are extremely private (and potentially stress-inducing) documents. An employee would rightfully feel quite reluctant to divulge such intimate information to an employer, as there may be other risks in doing so (tied to the social and political atmosphere of the company).  Therefore, if employees knew they were part of the evaluation, they might lie on surveys or in meetings to avoid any kind of negative impressions or feelings of failure/humiliation. I think the evaluation of the wellness center is a great example of the need for "involvement of the consumers during the formation of the evaluation design and their input into the evaluation questions [to] help ensure that the results will be used" (p. 37). Dealing with sensitive data could otherwise lead to significant evaluation limitations.

Reply to peer on discussion board about anticipated and unanticipated outcomes when evaluating:
I agree with all of your thoughts on the "pre-ordained" change idea.  I feel like, in general, many of us (maybe it's a societal thing) lack the patience that is required to make judicious steps toward change.  I know that I, perhaps like you, feel that "thinking through" a decision is sufficient to merit action. We already kind of know what we want, and it's all a matter of rationalization. We make things FIT our vision of change.
Systematic evaluation is cumbersome and tedious, but methodical, fair, and forward-thinking.  It offers a chance to view perspectives or angles of a problem/scenario that may have been hidden before, and it makes data objective versus opinionated, emotional, or tied to "the loudest voice in the crowd".  Hopefully as we come to learn more about "formal" evaluation processes, we can find points of compromise where our impatient desire for immediate change or "the next best thing" is tempered with a systematic review of data.  Maybe there are some viable shortcuts for more here-and-now decision-making!

Discussing Chapter Two as a whole:
The section "Benefits to Sponsors and Staff" stood out to me because of its optimistic promise of collaboration and group reflection. Ideally, in an educational setting, teachers would embrace objective evaluation activities because they would back up their "inklings" about a program's value, effectiveness, and impact.  It would be a reason to gather with colleagues to discuss the program and its processes as well as the "evaluation's philosophy, its procedures, and its anticipated outcomes" (p. 34).  In other words, it would provide a staging ground for staff to collaboratively stop and reflect on their program's design and activities in a natural, conversational way.  It could rekindle feelings of pride, ownership, and advocacy.  It could also lead to building a framework for colleagues to jointly select and organize "standards, indicators, evidence, and resources".  Evaluation serving as an impetus for discussion, colleague appreciation, and program scrutiny is a huge benefit in its own right... not to mention the hard data and "next steps" that will ultimately come from it.

Sunday, February 1, 2015

EdTech 505: Week 3


Week 3 Assignment
a) Boulmetis & Dutwin use three guiding questions to lead us in resolving the question of why to evaluate:
• What are the benefits and limitations of an evaluation?
• What factors ensure that an evaluation will be successful?
• How might one use evaluation results?
Answer the three questions above as they apply to the following scenario. (200-400 words in 1-2-3 order or use an alternative way to answer.)
Rich Kids, Poor Kids
The Maricopa Community College District (MCCD) partnered with Cashflow Technologies, Inc., to design and deliver a course in financial literacy. The course is based on the products - books, games, and videos - created and sold by the company. Components in the course include training sessions for future course instructors and seminars at which attendees play the Cashflow game. MCCD commissioned an evaluation of the program to verify its value to students and the community. In addition, Cashflow Technologies is exploring the possibility of marketing the educational program to educational institutions throughout the nation so an educational evaluation of the program should lend it credibility. Cashflow Technologies has been producing the materials for three years. The course was offered for the first time during the fall 2011 semester. No training sessions or seminars have been performed.
The program will be evaluated by students in Educational Technology at Boise State University. The evaluators have access to sales information for the products, student performance data from the course, and contact information for the students and instructors. The budget for the evaluation is $6,000.
b) Look back at the assignment you did last week. You described a potential program for your evaluation project. Now having read chapter two, you can answer these questions in reference to that evaluation:
1. Would the program you detailed in Chapter One benefit from an evaluation? If so, how?
2. What are the inherent limitations in the evaluation of the program?
3. How might you use the results to benefit the organization, community, schools, or yourself?