Tag Archives: research

Where Do Great Ideas Go to Die?

People have great ideas all the time that they never share with others. They secretly harbor them in their heads, and that is often where the ideas die. Part of the reason is that we’re not always given a platform to share them. Another is that we often feel our ideas might not be well received, so why bother? And for some, change is just hard, and doing things the way we’ve always done them is the comfortable default. I tend to believe people will listen if you bother to seek out the opportunity, even if change never happens.

I have what I think is a good idea, and I’m going to share it with you. I don’t have any expectations for change, but at least my great idea is not going to die in my head. Also note, this post was conceived before our current situation with moving courses online. I started this while on Spring Break.

I’ve been teaching online for a long time – since 1998. I can see an inherent problem with how we offer online classes to our students. We open classes, students rush to fill them, and all the online classes are full weeks before the semester begins. Sounds great, right? Well, it’s not. Not every student who signs up for an online class is prepared and ready for one. Many never make it past the first few days, finding it difficult to follow simple directions and get work completed. What do we do with these students? Some drop on their own; others stay and struggle for a while and eventually drop. The end result is that, often after just one week, a once-full class is left with multiple open spots. These are missed opportunities for students who were never even given a chance to register.

So here’s my idea. Open all online courses 3 days early and require students to complete an orientation. If students no-show or can’t complete simple to-do items, they are dropped from the class as a “No Show.” They were given an opportunity and failed. The student gets a full refund, and there is now an open spot for another student to enroll. But we don’t allow late registration, so that doesn’t work on its own. What if, however, we designated some courses as “rolling overload”? I made that term up. It means that faculty can designate the number of overload students permitted to enroll in their online courses. Presently, faculty can teach an online course that doesn’t have the required number of students (15) and are compensated on a sliding pay scale, meaning I can teach ENH114 with only 10 students enrolled if I’m willing to be paid a certain percentage of the full load. That number used to be 2.04 load for 10 students; five students would be 1.08 load. These are just examples at this point, based on old numbers.

With this new plan, faculty could designate the number of overload students they are willing to teach, and the load for that class would increase by that number. Then, after the three-day period in which students complete the orientation, the actual course load is determined. Here’s an example: I teach ENG101 with a course cap of 24 students. I designate 10 open spots for overload (2.04 load), so initially my new full-time load is 15 + 2.04 = 17.04. After the three-day orientation period, only 29 of the 34 students successfully make it through. My new load is 15 + 1.08 (5 extra students). We have technically helped 10 students: five were shown they were not adequately prepared for an online class and were given a refund, and five more were given the opportunity to take a class that previously would have been full and closed. And I am compensated for the extra students in my class.
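The bookkeeping above can be sketched in a few lines of Python. This is a minimal sketch only: the load figures come from the old example numbers in the post, the real pay scale may differ, and the function name is mine, not an official calculation.

```python
# A minimal sketch of the "rolling overload" load calculation described
# above. The extra-load figures are the post's old example numbers
# (5 extra students -> 1.08 load, 10 extra -> 2.04 load); the real pay
# scale may differ and may cover other counts.

FULL_TIME_LOAD = 15.0
OVERLOAD_LOAD = {0: 0.0, 5: 1.08, 10: 2.04}  # extra students -> extra load

def load_after_orientation(base_cap, completed):
    """Instructor load once orientation attrition is known."""
    extra = max(0, completed - base_cap)
    # The table only covers the post's example counts; unlisted
    # counts fall back to no extra load in this sketch.
    return FULL_TIME_LOAD + OVERLOAD_LOAD.get(extra, 0.0)

# The post's example: ENG101 capped at 24, 10 overload spots opened,
# and 29 of the 34 enrolled students complete the orientation.
print(round(load_after_orientation(24, 29), 2))  # 16.08 (15 + 1.08)
```

If all 34 students had made it through, the same function would return the full 17.04 load from the paragraph above.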

So let’s look at some real numbers, and I’ll show why I know this will work. For the last 5+ years, I’ve been keeping track of students enrolled during the first two weeks of my online classes. This semester I have 5 online classes. The two online 8-week ENG101 classes ended last week, and two new online 8-week ENG102 classes began this week. I already knew that at least 3 of the students enrolled in the ENG102 courses were not eligible to take the class, but I couldn’t drop them from ENG102 because the semester wasn’t over yet for ENG101. They hadn’t officially failed ENG101 yet, but trust me, they failed. So there were 3 wasted spots already. By the time all the official paperwork happened, we were already past late registration. But let’s focus on the two ENG101 courses. I started with 48 students and ended with 34. After the first week, I had a total of 43 students, so 5 enrollments were lost within that first week. Most of the other 9 students were lost within the next two weeks.

Here’s the best part: I can predict after one week which students will not succeed in the online course. As they complete the 7-step orientation, I rank them in order of how quickly and successfully they complete it. The names at the top completed it quickly with very little difficulty. Names toward the bottom are students who didn’t get started right away, required several prodding emails, and struggled to complete tasks successfully. The majority of the 9 students who dropped or were dropped after the first week were at the bottom of this list. Only 3 students in the top 32 have dropped or been dropped from the class, while the bottom 7 have either dropped or are failing the course.
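The ranking described above amounts to a simple sort. Here is a hypothetical sketch of it; the names, step counts, and completion times are invented for illustration, not real student data.

```python
# A hypothetical sketch of the orientation ranking described above:
# order students by how many of the 7 orientation steps they finished,
# breaking ties by how quickly they finished. All data here is invented.

students = [
    {"name": "Dana", "steps_done": 2, "days_to_finish": 7},
    {"name": "Alex", "steps_done": 7, "days_to_finish": 1},
    {"name": "Casey", "steps_done": 5, "days_to_finish": 6},
    {"name": "Blake", "steps_done": 7, "days_to_finish": 3},
]

# Most steps completed ranks first; ties broken by faster completion.
ranked = sorted(students, key=lambda s: (-s["steps_done"], s["days_to_finish"]))

print([s["name"] for s in ranked])  # ['Alex', 'Blake', 'Casey', 'Dana']
```

Students who surface at the bottom of a list like this are the ones the post predicts are at risk of dropping or failing.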

Now let’s look at what is happening right this minute in my two ENG102 courses. The orientation was due last night. Both classes were full before we started. I added one student off the waitlist and 2 students from my ENG101 classes that just ended, so I started with 51. One disappeared right when I opened the class on Wednesday of Spring Break. Poof. Vanished. Down to 50. Today, a week later and three days into the 8-week session, I have 44 students. What happened to those 6 students? Two more dropped on their own; one said she had too much going on to handle a new class right now. Three were complete no-shows. I emailed daily and then called, with no response, and they were dropped with a 43 (no-show) this morning. The last was a difficult decision, but he was dropped with a 43 because he couldn’t figure out how to complete the orientation and never responded to any of my emails or texts offering help.

So even with all the intervention, I still ended up with 4 open spots that didn’t get filled for this 8-week session. I bet there are a lot of students out there right now who wish they’d just signed up for an online class. But it’s too late now, as those 44 students are already deep into the course, discussing personal freedoms and learning to write arguments. Anyone who tried to join now would be too far behind for it to be a fair challenge. The system is just not designed well enough to give more students the opportunity to take online courses. Who knows if my idea would work? It’s certainly not without flaws. It’s just an idea, and now that it’s not dead in my head, I’m good with letting it go. Fly away, idea. 🙂

And Write6x6 is a wrap. I hope you enjoyed my brain dumps over the past 6 weeks. I’ll try not to wait until next year to post again.

Is There Value in Having Students Do Collaborative Group Projects?

Collaborative group projects in online and hybrid classes – Is there value in having students do them?

I go back and forth about whether I should dump it or keep it. Students hate it, but I think there is value, and it’s a lesson students need to experience. Things don’t always go the way they should, and students can learn a lot from dealing with that adversity.

I’ve been using a group project in my ENG102 hybrid course for about two years now, and I think it teaches students a lot about collaborating, working in a team, and sharing in the learning process with others. In the video below, I’ll share my process with you, as well as a few tools in Canvas that you may or may not be familiar with: Collaborations, Groups, Perusall and NoodleTools. 

Purpose: The purpose of the project is to teach students the process of writing an argumentative research paper. In groups of four, they work through the whole process in four weeks. The only thing they don’t do is the actual research; I provide that for them. Let’s take a look, and I’ll show the tools as they are integrated into the process.

Collaborative Group Projects in Canvas

Week 1: The One Thing You can do to Raise Enrollment

A six week “how-to” series
Week 1, Step 1: How to Impact Enrollment. But first, a story.

My biggest failure happened when I was a wet-behind-the-ears youth leader. I was actively looking to raise money for youth activities, and I had responded to an ad pitching a T-shirt fundraiser. The company featured exciting, fun, faith-based designs on sleeveless T-shirts and, for a limited time, was selling the shirts at a steep discount. The deal involved paying in advance with no returns and no refunds, but these things did not matter because these sleeveless shirts would sell themselves. I used my tax refund money to purchase the shirts. The shirts arrived and we began selling. But, instead of buying the shirts, our friends and families asked: Don’t you have any T-shirts with short sleeves? It turns out that people are so averse to wearing sleeveless T’s that the fundraiser tanked horribly. It was a hard pill to swallow, but it changed my life.

I learned never to make decisions based on a hunch. I came to love data-informed decision-making, and I am not alone. In this data-driven age, even the youngest consumers are making informed decisions by comparing products, pricing, and reputation, including incoming college students and their families.

You’ve probably guessed by now that the “one thing” you can do is based on what works and on proven methods, not gut instinct. So, what is the “one thing” you can do to influence the student decision-making process, raise enrollment, and raise GCC’s reputation in an increasingly crowded marketplace?

Before I spill the beans, you should know that, conversely, by not doing this “one thing,” you risk falling off your potential students’ radar completely and losing them to a competitor. There is a lot at stake and much to be gained.

The first step:

Go to www.gccaz.edu and type your last name into the search box. Take a look at your employee biography webpage. What do you see? If you were a student, is there anything on your page that would make you choose you?

What’s ahead:

WEEK 2: THE “ONE THING” AND ITS POWERFUL SWAY
When it comes to students choosing your classes, leaving choice up to chance is not your only option.

WEEK 3: THE “ONE THING” AND IT’S NOT BRAGGING
Reputation is king. Making your achievements public enables people to make informed choices.

WEEK 4: THE “ONE THING,” AND HOW TO INFLUENCE ASSUMPTIONS
Learn the top trait people assess when viewing strangers’ photos, and how your face, wrinkles and all, makes people choose you.

WEEK 5: The “One Thing” Before and After
If two faculty are each offering the same class, who would YOU choose?

WEEK 6: The “One Thing” and the Final Step

 

A VP, Dean, & Dept. Chair Walked Into a Bar…

By: Phil Arcuria, Research Director

A VP, Dean, & Dept. Chair walked into a bar to discuss a potential new GCC initiative (where did you think I was going with this?). The initiative will require various types of resources, and they want to make sure it “works.” They are discussing potential steps to take to evaluate the efficacy of the initiative. I happened to be sitting at the table next to them and, being a nosy neighbor, offered the following quick tips to help guide their efforts:
Tip 1: Write down the purpose of the initiative and the expected outcome(s). Verbally conveying it is not enough. Writing it down helps actualize it into something “real” that can more easily be refined and shared. One way to articulate the expected outcome is to complete the following sentence, “If the initiative is successful, ________ is expected to happen.” Or, a slight variant, “In order for this initiative to be deemed successful it must_______________.” Further refine the outcomes to capture the properties of SMART goals: Specific, Measurable, Achievable, Realistic, and Time-specific.
Tip 2: Substantiate in writing how you believe the initiative will result in the expected outcome(s). Be specific and detailed. Incorporate prior research, best practices, how it has worked at other institutions, etc. If no prior research is available, lay out the logical argument for how the initiative will achieve the expected outcome. Pretend you are on an episode of ABC’s Shark Tank and you have two minutes to convince someone to invest the resources required for the initiative (e.g., employee time, funds, etc.). Craft your pitch and read it over. Does it provide a convincing argument for how the initiative will likely result in the expected outcome? If not, further refine it until it can stand on its own. This step takes a lot of effort, but if we are not willing to put the effort into substantiating the value of the initiative, should we be asking anyone else to put effort into implementing it?
Tip 3a: Although Realistic is one of the characteristics of a SMART goal, it tends to get glossed over. Most of us have a tendency to channel our inner Babe Ruth and swing for grandiose, aspirational outcomes. In higher education, this tends to take two forms. The first is the unrealistic belief that most initiatives will have a direct effect on increasing persistence and graduation rates. Lots of factors go into whether or not a student persists and/or graduates, ranging from family and work obligations to their level of motivation and academic preparedness. Very few initiatives have the mass needed to directly move these metrics. Instead, focus on outcomes that you believe will be the direct result of the initiative. This also includes ensuring that the outcome and initiative are in alignment. One way to do this is to visualize your outcome as a tree and your initiative as a saw. Is your saw proportionate to the size of the tree? If you have a chainsaw to cut down a twig, your outcome is too meek given the initiative. If you have a handsaw to cut down a giant Sequoia, your outcome is too lofty given the initiative. Replace your inner Babe Ruth with your inner Zen and seek balance between the tree and the saw.
Tip 3b: Unrealistic outcomes also come in the form of unachievable performance targets used to quantify them (e.g., [outcome] … will increase 5% over the next year). Take time to talk through what would need to happen for this to occur. For example, if the outcome and performance target is to increase enrollments in course X by Y% over two years, how many more students would need to enroll in course X? Based on the response to Tip 2, is it reasonable that the initiative will result in that? What factors might prevent this from happening (e.g., a drop in overall enrollment), and how likely are they to happen? To test how realistic the outcome is, ask yourself, “What percentage of my salary would I be willing to bet that the outcome is achieved by the specified time frame?” If the percentage is low, then you should consider revising the outcome to make it more realistic. It can also be very beneficial to set a range as the performance target rather than a single estimate, to account for natural variability from year to year. Also, make sure you distinguish between a percent increase and an increase in percentage points. It may seem like a small nuance, but increasing 20% by 10 percentage points (20% –> 30%) is a lot different than increasing it by 10 percent (20% –> 22%).
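The percent-versus-percentage-points distinction at the end of Tip 3b can be made concrete with two lines of arithmetic, using the 20% baseline from the tip:

```python
# Percent increase vs. percentage-point increase, using the figures
# from the tip above (a 20% baseline rate).

baseline = 20.0  # starting rate, in percent

point_increase = baseline + 10       # +10 percentage points: 20% -> 30%
percent_increase = baseline * 1.10   # +10 percent:           20% -> 22%

print(round(point_increase, 2))    # 30.0
print(round(percent_increase, 2))  # 22.0
```

Writing the target as “percentage points” versus “percent” changes the goal by a factor of four here, which is exactly why the tip says to pin down which one you mean.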
Tip 4: Write down any potential secondary or indirect outcomes of the initiative. These are outcomes that you do not expect the initiative to directly influence but that it might indirectly influence. In short, success on the initiative will not be determined by these things happening, but they are a nice additional potential outcome. Persistence and graduation rates tend to fall in this category. Do this by expanding the sentence in Tip 1 to, “If the initiative is successful, [direct outcome] is expected to happen, which in turn may lead to [indirect outcome] happening.” Indirect outcomes are like donuts on Friday: they are not essential to measuring the success of the day, but they are still worth vigorously pursuing.
Tip 5: Make sure the metric selected to quantify the outcomes directly relates to the outcomes. This sounds straightforward, but it is easy to select the wrong metric for a given outcome. My professional leaning is toward quantitative methods. There is something elegant about numbers and their utility. However, it is important to acknowledge that not all things should be evaluated in quantitative terms. Antoine de Saint-Exupéry eloquently reminds us of this in the excerpt below from his classic tale The Little Prince:
“Grown-ups love figures. When you tell them that you have made a new friend, they never ask you any questions about essential matters. They never say to you, ‘What does his voice sound like? What games does he love best? Does he collect butterflies?’ Instead, they demand: ‘How old is he? How many brothers has he? How much does he weigh? How much money does his father make?’ Only from these figures do they think they have learned anything about him.”
Figures are immensely beneficial and in many cases are the best method for evaluating an expected outcome. But do not let the method drive the need. Go with the approach that best fits the nature of the initiative under evaluation – but do not be deceived into thinking that pursuing qualitative outcomes is an easier shortcut to evaluating the success of an initiative. In many cases it is a much more difficult path. Ideally, most things should be evaluated from both perspectives.
After enthusiastically conveying these tips, the VP, Dean, and Dept. Chair looked at me with puzzled faces and said, “So is this how research directors spend their Friday nights?” I sheepishly slinked back to my table as I muttered to the group, “The next round is on me.”
For more information, please email us at spa@gccaz.edu.