Twearning: The Experience

[Chart: Comparing Course Grades, Spring 2012 to Spring 2011]

The jury’s in. The verdict is: Twearning was modestly successful.

Twearning

Twearning is the use of Twitter in the classroom to promote student engagement and learning. In this post I explained how I had incorporated Twitter into the Sports Marketing Law and Ethics class at my university, a course required for the Sports Marketing major. The class was taught face-to-face and was composed of 18 juniors and seniors, most of them Sports Marketing majors. One student was female and the rest were male, and the students represented a variety of ethnic backgrounds.

Use of Twitter in this Course

Students were required to do several things (a quick tally of the Twitter-related points follows the list):

Tweet once during class and twice weekly outside class (15/315 points)

Provide group summaries of tweets for the previous week (15/315 points)

Follow the tweets of 3 professional athletes and write a social media policy based on that information (50/315 points)
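
Taken together, the Twitter-related components were worth 80 of the 315 course points, roughly a quarter of the grade. Here is a minimal tally using only the point values listed above (the variable names are mine, for illustration only):

    # Tally of the Twitter-related points listed above (course total: 315 points)
    tweeting = 15        # tweets during class and twice weekly outside class
    summaries = 15       # weekly group summaries of tweets
    policy_paper = 50    # social media policy based on athletes' tweets

    twitter_points = tweeting + summaries + policy_paper   # 80 points
    share = twitter_points / 315 * 100                     # about 25% of the grade
    print(f"{twitter_points} points, {share:.1f}% of the course grade")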

Student Reaction

The following are unedited student comments.

Best of Using Twitter

  • it helped me with my classmates easier. If I had a comment or curiosity, it was easy to get a response and the information i needed.
  • Seeing how it can be used both professionally and casually as well.  As well as quick communication with a very wide variety of people.

  • Best thing was the social interaction in and out of the class. If someone needed to ask a quick question, they could easily send a tweet or direct message to someone and get a response back, fairly quickly.
  • Made me stay up to date with the course material, made sure that I was engaged during class time as well.
  • Following athletes (2 students)
  • That every chapter was summed up with the use of twitter and in our own words which helps us learn because most students can relate to the way we learn information.
  • That it made everyone post something about the course in their own words.
  • I got to communicate with my class mates and view the most popular topics and it helped me review and memorize course material
  • The best thing about using twitter was that it kept me active in the class and out of class.
  • learning new social media
  • I appreciated using twitter in class because it allowed us to read material and summarize what our findings. It also helped keep us up to date with a world of technology that is evolving very fast.

Worst of Using Twitter

  • It the hard was remembering to tweet all the time. it was not bad to use at all.
  • Sometimes the character limit.  But that forced me to be concise.
  • Having to tweet twice outside of class was probably the worst thing. Students would wait till the last minute to tweet and it would consist of some random fact in the book. I feel that tweeting during in class is more effective.
  • It was another thing to have to remember to do outside of class, also finding the tweets of my classmates for the group summaries was time consuming.
  • Posting 2 tweets outside of class
  • Saving the tweets and having to read through them for possible legal issues.
  • On the learning aspect nothing was wrong, just making every tweet count and worth giving the right information.
  • It kind of became too much after using it over and over again
  • I did not have any problems
  • I have nothing bad to say about twitter. It was fun to use for class.
  • use was unrealistic
  • I found that using twitter sometimes took away from personal interaction with classmates and professor. However, it seems that technology is taking us that way everywhere we look.

Preliminary Conclusions

Student performance, as measured by exam results and course grades, was better than in the previous (Spring 2011) offering of the course. An implication from the exam results (noted in earlier posts) and the course grades is that students in the middle performed better. Students at the top tended to perform well no matter what the format. Note that I’ve only included raw, unedited student comments here. I have not yet conducted an analysis of the pre- and post-exam results, nor have I compared the pre- and post-surveys of student perceptions of Twitter use and student engagement.

The following are first-blush observations. The student comments summarized here indicate that:

  • Twitter was a useful tool for communicating with one another
  • Summarizing the material and seeing their classmates’ summaries was a useful way to learn

What students liked least was tweeting outside of class. That’s an interesting point, because the students also seemed to find the summaries of those tweets one of the best things about using Twitter in this course. One thing I noted in a previous post is that permitting students to use their laptops and, gasp, cell phones did not hurt students’ performance in the class. This was contrary to what I expected when I decided, for the first time, to drop the no-cell-phone rule.

This may seem like the end of the road, but the exciting part is still ahead: conducting more analysis to determine what worked, what didn’t, and why.

I’m considering this approach for one of my online classes in the fall; it may help foster more student engagement. The withdrawal rate also tends to be high in the particular class I’m thinking about, and Twitter use might help reduce it. I’m also considering other uses.

This has been an interesting journey. More to come…

Still Adrift in Education

In his essay, ‘Academically Adrift’: The News Gets Worse and Worse, Kevin Carey explains that there is further evidence not only that college students fail to learn in college, but also that students who score lower on the CLA (Collegiate Learning Assessment) fail to find financial security after graduation.

In an earlier post, I discussed some of the conclusions I reached from the sections of the book which I had read. Those conclusions were:

  • There is an inverse relationship between the number of faculty publications and a faculty orientation toward students.
  • The higher students’ grades in the course, the more positive the student evaluations.
  • Grade inflation probably exists.

In a later post, I discussed critical thinking as a concern: students don’t “enjoy” the challenge of traditional problem solving the way I (and other faculty) do, and that affects whether they learn. If students do not see tackling and solving problems as a challenge (and we as educators should do as much as we can to make problem-solving interesting), then student learning will suffer significantly.

A Not So Radical Transformation in a Core Business Course

In the introductory business law course that is required for all business majors, all the faculty teaching the course agreed to make substantial changes in the way the course was taught in order to acknowledge and address perceived deficiencies: students’ lack of college-level reading ability, lack of college-level writing ability, and need to improve critical thinking. Students complained a great deal about the additional work.

Assessing and Working to Improve Reading Skills

Although my own experience with students confirms that more practice reading and writing would help them, the students did not agree. When asked whether My Reading Lab (a publisher-created product) helped them, students said no:

[Chart: Whether My Reading Lab Helped (BA18F11)]

Note that this response reflects only the students’ perceptions. We have not yet completed an analysis to determine whether those who performed better on My Reading Lab performed better on the tests or in the course; we will analyze that data later. This also does not include longitudinal data, i.e., whether students, upon reflection, would decide that they had learned more than they thought from the additional reading practice. However, what this data does show is that students did not embrace the additional reading practice and testing requirement.

Reading the Textbook

Student preparation for class is a concern. Many students do not read before attending class; they attend class and then read afterward. In addition, many students did not study. As part of the course redesign, we required quizzes before students attended class. Students (74.2%) agreed that the quizzes helped them keep up with the reading. Even so, many still didn’t read everything. The following graph lists the students’ responses, collected at the end of the semester, about how much of the textbook they had read:

[Chart: Percentage of Readings Completed (BA18F11)]

Note that 40/202, or 19.8%, read 90% or more of the readings, and 80/202, or 39.6%, read 80-89% of the readings. That means that nearly 60% of the class read 80% or more of the readings. These are the results obtained after faculty required that students read and take a quiz on the material before attending class, so students were more motivated to keep up with the reading. How would these results differ if students had not been required to take a quiz before attending class?
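
As a quick sanity check, the percentages above follow directly from the reported counts (202 survey respondents); a minimal sketch:

    # Reading-completion percentages from the end-of-semester survey (202 respondents)
    respondents = 202
    read_90_plus = 40      # students who read 90% or more of the readings
    read_80_to_89 = 80     # students who read 80-89% of the readings

    pct_90_plus = read_90_plus / respondents * 100     # ~19.8%
    pct_80_to_89 = read_80_to_89 / respondents * 100   # ~39.6%
    pct_80_plus = pct_90_plus + pct_80_to_89           # ~59.4%, i.e. nearly 60%

    print(f"{pct_90_plus:.1f}%  {pct_80_to_89:.1f}%  {pct_80_plus:.1f}%")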

Studying

Student preparation also includes studying. The following graph shows the number of hours students reported studying each week.

[Chart: Time Spent Studying (BA18F11)]

According to these self-reports, 21.2% of students studied between 1 and 3 hours per week, 27.7% studied between 3 and 5 hours per week, and 21.7% studied between 5 and 7 hours per week. Students should have studied roughly 8 hours per week (2 hours outside class for each hour in class; this was a 4-unit course). In Chapter 4 of Academically Adrift, the authors note that students report spending 12 hours per week on their courses outside of class. According to Figure 4.2 of the book, in a 7-day week, students spent approximately 7% of their time studying.
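
Two quick calculations tie those numbers together (a minimal sketch using only the figures above):

    # Expected weekly study time: 2 hours outside class per hour in class, 4-unit course
    units = 4
    expected_hours = 2 * units            # 8 hours per week

    # Academically Adrift, Figure 4.2: about 7% of a 7-day week spent studying
    hours_in_week = 7 * 24                # 168 hours
    adrift_hours = 0.07 * hours_in_week   # ~11.8 hours, consistent with the
                                          # ~12 hours/week the authors report

    print(expected_hours, round(adrift_hours, 1))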

Conclusions so far

The educational process requires that both the faculty and the students participate; if students have not completed their share, then education and learning will not necessarily take place. I don’t know how this data compares to other studies of student reading, but it is difficult to foster learning if both parties are not fully invested. Students have a variety of reasons for that lack of involvement, but if the investment in education is relatively small, then the improvement in learning will be small.

In addition, this past semester my student course evaluations were much lower (partly due to a change in the institution’s survey instrument). Because I am tenured, I do not face losing my job over the change in my evaluations. Adjunct faculty, however, face a different reality: they depend on good student evaluations to be rehired, so adding rigor to a class could cost them their jobs.

Using Research on Learning to Guide Teaching: Huh?!

It seems perfectly sensible and logical: as educators, we should take advantage of the research on how people learn and use it to guide our teaching. But we don’t! Instead, we stick with the tried and true (I did it this way, I learned this way, and if students don’t get it, that’s their problem!). I’ve discussed this issue in other posts, for example, Is Higher Education Ready to Change, but it’s worth repeating.

Harvard recently held a one-day symposium on the issue to try to encourage faculty to incorporate cognitive research findings into their teaching. The conference kicked off Harvard’s receipt of a $40-million gift, which forms the basis of grants to faculty under Harvard’s Initiative on Learning and Teaching.

In a Chronicle article, Harvard Seeks to Jolt University Teaching, Dan Berrett summarizes the purposes of the symposium and workshop. Berrett quotes Dr. Wieman, a Nobel Prize-winning physicist who has conducted research on science education and how students learn, who explained that faculty often teach by “habits and hunches.” This is partly because most faculty are content experts, not pedagogy experts.

Other conference speakers noted that students are changing and that, for example, students are not as curious as before. Dr. Mahzarin R. Banaji debunked the popular belief that teaching should be designed to fit diverse learning styles, e.g., kinesthetic or visual styles. Others noted the importance of quizzing and frequent writing.

So what does this mean? It means that universities should encourage faculty to develop evidence-based teaching practices. It means that faculty workloads would have to be adjusted to permit time for faculty to implement and evaluate new methods of teaching. It means that universities should assist faculty in assessing the impact of these new methods of teaching. The University of Central Florida has a center devoted to helping faculty assess the impact of their teaching. I’m ready to try it!

ePortfolios for Assessment

I am attending the Western AAEEBL conference on ePortfolios in Salt Lake City, Utah. Helen Barrett was the lunchtime speaker, and she provided a great deal of information, which I have compiled in tweets under the #11WAAEEBL hashtag. [To find those, go to Twitter and type that hashtag in the search box.] Barrett discussed three points that I want to note here:

  1. Label the eportfolio with an adjective so we know its purpose, e.g. learning eportfolio
  2. Mobile technology is important for future technologies
  3. Digital storytelling is more than entertainment; it’s also a method of learning

Those items have given me food for thought as I continue my journey to determine whether ePortfolios are a solid assessment tool. I’ve discussed this a little in a previous post.

The “A” Word: Using ePortfolios

The “A” word is Assessment. I blogged about it a couple of days ago and noted that I’d talk about my foray into ePortfolios.

I am reminded of the saying “something old is new again” (although I can’t recall its source). At one point in my children’s education, portfolios were popular. Some of you may recall that period. My children were asked to collect their papers to present to teachers and outsiders for evaluation. I recall that at least one of my children had a portfolio filled with crumpled papers, demonstrating that he wasn’t as concerned with appearance as with content!

ePortfolios are based on similar principles. In October 2010, I had the opportunity to attend a conference at which Dr. Helen Barrett, a preeminent expert on ePortfolios, made a presentation. ePortfolios can be used as formative and/or summative assessments. Dr. Barrett summarizes ePortfolios as “an electronic collection of evidence that shows your learning journey over time. Portfolios can relate to specific academic fields or your lifelong learning. Evidence may include writing samples, photos, videos, research projects, observations by mentors and peers, and/or reflective thinking. The key aspect of an eportfolio is your reflection on the evidence, such as why it was chosen and what you learned from the process of developing your eportfolio.”

I used ePortfolios during a one-year period and hope to use them again in the Spring 2011 semester in at least one class. I used Mahara, an open-source ePortfolio system. I used the ePortfolio to (1) encourage student self-reflection on their learning as related to the course learning outcomes and (2) encourage student reflection on the work in my course and other courses as it related to the overall mission of the school. I’m currently compiling the results of that use, but the results were mixed, as indicated by the following table.

[Table: ePortfolio Student Reaction]

I still have a great deal of work to do to use ePortfolios to more effectively support assessment, self-assessment and metacognition. But I have great hopes that they can be used, in conjunction with other assessment tools, to reliably and validly assess learning.

The “A” Word: Assessment

Measuring student learning is one of an instructor’s most difficult tasks. Assessment is also a difficult task for institutions.

In the article Measuring Student Learning, Many Tools, David Glenn discusses assessment as an institutional issue and points out that a group of institutions has joined together to study different methods of assessment. The group, headed by Charles Blaich, director of Wabash College’s Center of Inquiry in the Liberal Arts, seeks to collect data to determine effectiveness. Dr. Blaich encourages universities to use a variety of tools, as appropriate for the school, to collect data. He also encourages universities to use data they already collect, when possible.

I’ve used a variety of assessment methods in my classes: exams, scoring rubrics, ePortfolios using Mahara (an open-source program), and now possibly Taskstream and computer-based testing (such as Criterion, a writing program). I have tried mind mapping, graphic organizers, research papers, short papers, multiple quizzes, take-home exams, and oral presentations.

The tension is palpable. I can most easily measure whether someone has memorized the content through a test. I can measure critical thinking and the ability to apply knowledge through a test. However, does that demonstrate learning, or deep learning? How does one measure learning (see this website, Approaches to Study: Deep and Surface, for more on the concept of deep learning)? Measure critical thinking? Measure successful integration of newly learned information with information previously learned?

So I muddle along, measuring learning based on how my own learning was measured (primarily through multiple-choice, true-false, essay, and standardized, nationwide, validated tests, depending on when and what), and I add in what I learn from attending conferences, listening to experts, and applying what I’ve learned to my classes in an effort to truly encourage and measure learning. Is it successful? It depends on whom you ask.

That’s enough for this post; in the next post I’ll briefly discuss my foray into ePortfolios, my current preferred assessment method when I have adequate time to process the student information.

As you can see,  I will continue to struggle with the “A” word!