Saturday, December 12, 2015

Final Project Working with Unity

Here is the project I created for middle school students learning about the Elements and Principles of Design. They are looking to improve their photography and graphic design for school reporting and publications. This online activity offers instruction and self-assessment to review this terminology and apply it to new images. With further development, this project will include an interactive gallery to leverage the multimedia, 3D power of the Unity platform. The scripting, navigation, and organization of game elements were a very labor-intensive process! It's definitely a pride-worthy product.


http://ewray-511-finalproject.weebly.com

Direct link to game (requires browser with Unity plug-in)
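For anyone curious about the scripting side, here is a minimal sketch of the general scene-navigation pattern in a Unity project like this one (not the project's actual code; the class name and scene name are placeholders): a UI button that loads a named scene when clicked.

using UnityEngine;
using UnityEngine.UI;
using UnityEngine.SceneManagement;

// Hypothetical example: a menu button that navigates between lesson/quiz scenes.
public class SceneNavigator : MonoBehaviour
{
    // Name of the scene to load, set in the Inspector (e.g. "PrinciplesQuiz").
    public string targetScene;

    void Start()
    {
        // Wire the attached UI Button to load the target scene when clicked.
        GetComponent<Button>().onClick.AddListener(LoadTarget);
    }

    void LoadTarget()
    {
        SceneManager.LoadScene(targetScene);
    }
}

A component along these lines, attached to each menu button, is one common way to move players among instruction, self-assessment, and gallery scenes.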

Friday, May 1, 2015

EdTech 523: Final Project Submission

Technology Integration for Teachers 101 - A Schoology Site

FINAL COURSE PROJECT

My Project plan

My Evaluation plan (links to surveys and rubrics are embedded within)

Project URL: https://www.schoology.com/course/235200272/materials Access code: 3FDXQ-7BF2M



Synchronous Presentation of Final Course Project

Please visit my Self-evaluation document

(I was delighted to have received an A+ on this project from Dr. Rice!)

Thursday, April 30, 2015

EdTech 505: Week 15 - Final Evaluation Report

Yes, I'm finally in the home stretch!

Please view my final Program Evaluation Project by clicking here.

Discussion Board Reflection
At the beginning of the semester, I had no experience with evaluation in this format, although I looked forward to this course because I knew it would be a pivotal skill in my new (this year) role as Technology Integration Specialist. There were a number of pilot programs rolled out throughout the district with different devices, and I'm still trying to wrap my head around them all! Researching various tools, having "EPD"-style conversations with colleagues, and utilizing concrete data collection techniques to help determine a program's success or merit will certainly not end here for me! It has been a very busy semester and a very busy process, all in all.

Sunday, April 26, 2015

EdTech 505: Week 14

This week we are continuing to collaborate with our small groups (asking questions, etc.) as we gather/analyze our data and begin to write our report.


Discussion Board Posts

Three crucial things to be included in an evaluation:
1. Regular, brief meetings about progress and action-points/deliverables -- ongoing communication throughout process
2. Well-crafted and varied data collection instruments and, similarly, transparent and easy-to-interpret results
3. Suggestions of areas for improvement based specifically on data results that are unique to our program (not just canned, generic responses)

One critical job skill: Organization (of people, resources, and time)

I would set up the evaluation to include:
- thoughts/ideas from all stakeholders at the very beginning (cast the net wide)
- an evaluation team internal to the company that helps work directly with the evaluator (but is a voice/go-between of the relevant stakeholders)
- early planning stages to include the evaluator
- the development of clear objectives tied to clear activities tied to clear evaluation measures/questions
- many avenues and formats for data collection
- A lot of time spent on careful data analysis
- A well-crafted report with a lot of visuals to make the relationships/patterns among the data very clear
- Several formative evaluation processes before a summative evaluation
- Transparency and communication as much as possible to encourage trust and a spirit of joint ownership

Reply to a peer:
Yes, gap analysis was also a big takeaway for me. Taking stock of the present and the future, and identifying concrete steps to bridge the two, seems like a straightforward way to design a program with the "end" in mind. The vision of the future would help build the backbone for the program objectives, which would then be aligned with evaluation questions/tasks. Being detail-oriented as an evaluator would help the program staff ensure none of these steps were left out of the process. It's also helpful that the evaluator is focused exclusively on these kinds of details instead of a bunch of other tasks/duties within the organization (which prevent one from seeing the "big picture").

Sunday, April 19, 2015

EdTech 505: Week 13 - Square Wheels

Discussion Board

Task: Generate as many thoughts as you can about this illustration and your reactions, relative to program evaluation.

Who/what does the person in front represent?
This is the project leader, who often ends up pulling more weight and doing more legwork than offering direction. In this image the leader is barely looking beyond his or her own feet and may not be seeing other possibilities on the horizon. There is not a lot of future-thinking because this person is arduously pulling the program along just one little step at a time.
Who/what do the people in back represent?    
These are the project facilitators and teammates.  They, too, are working hard to get the job done.  They are even working together, making sure their steps are in sync and that no one falls behind.  Nevertheless, they can't see up ahead because of the size and stature of the program (the wagon) itself. In some ways, they feel like they are just following blindly and according to the directives from up ahead.  There is no time or opportunity to pause, catch one's breath, reflect on the process, or change course.
What does the body of the wagon represent?
This is the program.  It's a big task and has a lot of parts!  It requires a lot of coordination to implement and keep it running.  Nevertheless, the project stakeholders do whatever it takes to keep it moving in the only way that they know how.
What do the square wheels represent?
This is the status quo way of doing things. (It's the result of the first iteration of the program.)
What do the round wheels represent?
This is the potential for improvement.  It represents the collection of opportunities and resources to create a better program. This includes designated evaluation personnel, planning time, collaborative discussions, testing measures, pre-existing data, regular review/reflection of objectives, participant feedback, etc.  This potential always exists and is a part of every program.  It's just that sometimes we overlook this potential or don't take the effort to tap into it to improve our program delivery.
What is the overall vision or interpretation of the illustration?
Typically we have the capacity and tools to make our programs more efficient and effective, but we aren't utilizing them. Instead, we are carrying this "potential" around and carrying on with programs that are bulky, clunky, inefficient, and barely effective. If we recognized the importance of taking stock of our team's skills and opportunities for growth we could not only reduce the "weight" of our program, but also coordinate efforts to get things moving along more efficiently and effectively.  For example, instead of just sitting on data and shuffling it around year to year, we could comb through it and help it grow legs to get us from point A to B in a better way.
(Response to a peer:) I hadn't thought about the role of pride in this equation.  Interesting viewpoint.  In my case I haven't seen pride so much as laziness and comfort.  We know how to do something, and it keeps going forward somehow or another, so why stop to see if it can "go" better?  Some leaders or staff members might say if it ain't broke, why (look for reasons and ways to) fix it?  If they could press the pause button on life and fast forward to a better version of their program, then they would see the rationale.  But it's hard when you are in the present just pulling or pushing forward.

Saturday, April 18, 2015

EdTech 523: Final Discussion

Discussion Post Directions: Each subgroup in the discussion forum is a TV show. Click on the provided link of your choice and post why you feel that synchronous online learning is like that TV show, using examples from the readings and prior learning.

My Contribution: I, too, believe that Synchronous Learning is a lot like the show The Voice, and for many of the same reasons that have already been expressed. The element of coaching is a hallmark of this program. Each coach's team is made up of a diverse group of singers. This is similar to the diverse learning styles and interests in an online class. The coaches, professional (and famous) singers themselves, start with high expectations for their "students" and guide them throughout the season to be the best performers they can be. This is similar to the synchronous instructor. "Instruction encompasses any of the kinds of learning that happen when faculty members, knowledge experts, or facilitators meet with learners, usually in a planned manner in a specific online venue, to guide them through the achievement of learning objectives" (Finkelstein, 2006, p. 3). On The Voice, as in the synchronous online classroom, coaches make suggestions and offer additional resources (e.g. fellow celebrity singers) to help the contestants/students improve their craft. "Support is a crucial element for retaining and motivating learners, whether it is provided by just-in-time assistance from a peer, instructor, tutor, advisor, or librarian" (Finkelstein, p. 4). Feedback is prompt and bi-directional.

Another important similarity is the process of blind auditions, where the would-be coaches listen to the singers without the benefit of face-to-face communication. This is similar to synchronous learning in that the teacher does not have a physical presence with her students. Being a good listener is of utmost importance. Likewise, for the student, expressing oneself clearly and impressively is of paramount importance; it is all a teacher and classmates have to "go by". Developing communication skills that don't rely on body language, eye contact, real-time discussion, etc. is similar to The Voice contestants performing with only the quality of their sound to represent themselves. The coaches (like the online instructors) are assessing the singers'/students' demonstration of "real-time skills and analytical thinking" (Finkelstein, p. 6) in every performance.

During The Voice season, there are various "rounds" that involve both competition (knock-out rounds) and teamwork (duets). This is similar to the kinds of activities that might be incorporated into a synchronous learning class. (However, a class would typically err more on the side of cooperation and collaboration!) Each week is a new song, a new task, a new performance. The class, like the show, is dynamic and always looking to stretch the singer/learner's "range" or style. In both cases the audience is real (TV audience and Web 2.0) and exists in real time. "Real-time learning environments invite active learning" (Finkelstein, p. 22). In The Voice, as in good synchronous classrooms, activities are meaningful and grounded in real-life tasks. "The presence of a live instructor, combined with the use of the human voice and a rich set of facilitation and collaboration possibilities, opens up a new world..." (Finkelstein, p. 7).

Tuesday, April 14, 2015

EdTech 505: Week 12 - Critique of an Evaluation Report

Assignment: Critique an Evaluation Report ("The Maine Experience")


Discussion Posts:

#1: Program Evaluator Job Ads
Discuss what you find, or don’t find, in these ads. What do you think of the jobs as described? How do you stack up to the qualifications and requirements? Think you could make “program evaluation” a PT or FT business? Other thoughts on the job ads?

These jobs certainly do reflect the skills we've overviewed and practiced in this course. Their descriptions include most of the components we included in our Response to RFP (meetings with stakeholders, designing various evaluation methods, traveling and overseeing the administration of the evaluations, collecting and analyzing data, offering a report and making suggestions if requested). Upon first glance, they don't seem TOO far out of my league after working through this course (and given my previous experience & educational coursework). I know a job poster is going to err on the side of the VERY specific! However, there are some things that made me feel a little ill-prepared to take on one of these positions.

First of all, many jobs ask the applicant to specify their evaluation skills (e.g. "experience employing social science empirical research methods, theories and analytical approaches" or "experience using a mixed-method approach"). Outside of learning the theory and ideal techniques of program evaluation and completing one formal evaluation, I simply don't have these kinds of extensive and varied experiences! In addition, very few of the jobs are seeking teachers or educational technologists. A large number of positions also request expertise in statistical/data analysis, including special software packages. This is one area that is definitely still quite out of my league! The words "metrics", "audits" and "IT" come up quite often. I would have a big learning curve with those aspects. Many are also looking for people with experience in project management, which I just don't possess. I haven't pursued any administration coursework or experiences in my career so far. It makes sense to me that many request experience with research or grant writing. (I do have some of that.)

Overall, it seems that the ideal candidate being sought is one who not only has experience in the specific field (healthcare, social services, etc.), but also has experience being on the other side of the table (a "leadership" stakeholder -- instructional designer, grant-writer, program manager, etc.). They also want an excellent writer (many requesting a sample of one's work), mathematician, designer, coordinator, etc. These jobs seem pretty intense! I would've thought they would pay more than they do. (At least in the case of the ones that post their salary...) A lot of work is involved and a lot of expertise is expected. (Where are the entry-level positions?!) For now, I'm content with using program evaluation skills within the context of my current teaching career.

#2: Discussion of end of book

Yes, the more resources (and exemplars) the merrier. One big difference, I feel, between Appendix B's example report and the one we're required to do for this course is that it gives greater treatment to "recommendations" than we are required to provide in our own report's discussion. Appendix B also reports results according to each program objective; however, I don't believe that is expected of us in this course's final report (although, if it makes sense, I can see how that would help organize and frame the discussion of data). Appendix B's report also includes a lot of data from focus groups and interviews, which I understand we were discouraged from using in our own report for this course.

Chapter 10 was a good pep talk for evaluators looking to break into the professional field. Being proactive in the search for work seems like a daunting task! At least it ensures you are evaluating something you're interested and experienced in. The discussion of "emergency resources" was interesting, as it got me to consider the ethical dilemma of evaluating -- is the relationship with the organization or the evaluation results more important? Definitely a tricky scenario! The discussion of contract negotiation was also something I'd never experienced before. Program evaluation seems quite entrepreneurial in nature. A lot of self-direction, seeking out clients, and building a reputation is involved -- not to mention the evaluation processes themselves! The first few years must be tricky.