Tag Archives: research

Is There Value in Having Students Do Collaborative Group Projects?

Collaborative group projects in online and hybrid classes – Is there value in having students do them?

I go back and forth on whether I should dump the group project or keep it. Students hate it, but I think there is value in it, and it’s a lesson students need to experience. Things don’t always go the way they should, and students can learn a lot from dealing with that adversity.

I’ve been using a group project in my ENG102 hybrid course for about two years now, and I think it teaches students a lot about collaborating, working in a team, and sharing in the learning process with others. In the video below, I’ll share my process with you, as well as a few tools used with Canvas that you may or may not be familiar with: Collaborations, Groups, Perusall, and NoodleTools.

Purpose: The purpose of the project is to teach students the process of writing an argumentative research paper. In groups of four, they work through the whole process in four weeks. The only thing they don’t do is the actual research; I provide that for them. Let’s take a look, and I’ll show the tools as they are integrated into the process.

Collaborative Group Projects in Canvas

Seeking Time-Turner

A time-turner, for all of you who have not read the Harry Potter book series, is a device that allows the wearer to travel back in time.

(http://harrypotter.wikia.com/wiki/Time-Turner)

As you can see, it is also very fashionable. Hermione used the time-turner to attend classes that occurred at the same time during her third year at Hogwarts.

If I had the chance to use a time-turner regularly, like Hermione, I would use it to research more at the community college level. Since completing my Ph.D. last semester, I have missed researching classroom interactions. I find that between teaching full-time and being a new mom, I am stretched pretty thin when it comes to time.

I would love to have more time to improve our students’ mathematics classroom experience through research. I have colleagues in my department with NSF grants that fund their research, and I am in awe of them teaching full course loads and conducting research.

This is where the time-turner would come in handy. I would teach my classes but then be able to turn back time and be in my office hard at work creating and implementing a research study of my own design. I would also write journal articles that would help spread my findings to the community college and broader mathematics education communities.

The benefit would be the chance to help mathematics instructors improve their teaching and in return help students in their mathematics classrooms achieve a better understanding of the concepts.

Since I will have to live without a time-turner for the foreseeable future, I plan to find some stability in my teaching load and work/life balance.

My current goal is to survive this first year as a residential faculty member with an overload and enjoy being a new mom. In the next year, I am planning to join one of the research groups that is already in my department. This will allow me to dip my toe back into the research pool. Eventually, I would like to be the one awarded an NSF grant to conduct research here at GCC.

 

Week 1: The One Thing You Can Do to Raise Enrollment

A six-week “how-to” series
Week 1, Step 1: How to Impact Enrollment. But first, a story.

My biggest failure happened when I was a wet-behind-the-ears youth leader. I was actively looking to raise money for youth activities, and I had responded to an ad pitching a T-shirt fundraiser. The company featured exciting, fun, faith-based designs on sleeveless T-shirts, and, for a limited time, was selling the shirts at a steep discount. The deal involved paying in advance with no returns and no refunds, but these things did not matter because these sleeveless shirts would sell themselves. I used my tax refund money to purchase the shirts. The shirts arrived and we began selling. But, instead of buying the shirts, our friends and families asked: Don’t you have any T-shirts with short sleeves? It turns out that people are so averse to wearing sleeveless T’s that the fundraiser tanked horribly. It was a hard pill to swallow, but it changed my life.

I learned never to make decisions “based on a hunch.” I came to love data-informed decision-making, and I am not alone. In this data-driven age, even the youngest consumers are making informed decisions by comparing products, pricing, and reputation, including incoming college students and their families.

You’ve probably guessed by now that the “one thing” you can do is based on what works and on proven methods, not gut instinct. So, what is the “one thing” you can do to influence the student decision-making process, raise enrollment, and raise GCC’s reputation in an increasingly crowded marketplace?

Before I spill the beans, you should know that, conversely, by not doing this “one thing,” you risk falling off your potential students’ radar completely and losing them to a competitor. There is a lot at stake and much to be gained.

The first step:

Go to www.gccaz.edu, and type your last name into the search box. Take a look at your employee biography webpage. What do you see?  If you were a student, is there anything on your page that would make you choose you?

What’s ahead:

Week 2: The “One Thing” and Its Powerful Sway
When it comes to students choosing your classes, leaving the choice up to chance is not your only option.

Week 3: The “One Thing,” and It’s Not Bragging
Reputation is king. Making your achievements public enables people to make informed choices.

Week 4: The “One Thing,” and How to Influence Assumptions
Learn the top trait people assess when viewing strangers’ photos, and how your face, wrinkles and all, makes people choose you.

Week 5: The “One Thing,” Before and After
If two faculty members are each offering the same class, who would YOU choose?

Week 6: The “One Thing” and the Final Step

 

A VP, Dean, & Dept. Chair Walked Into a Bar…

By: Phil Arcuria, Research Director

A VP, Dean, & Dept. Chair walked into a bar to discuss a potential new GCC initiative (where did you think I was going with this?). The initiative will require various types of resources, and they want to make sure it “works.” They are discussing potential steps to take to evaluate the efficacy of the initiative. I happened to be sitting at the table next to them and, being a nosy neighbor, offered the following quick tips to help guide their efforts:
Tip 1: Write down the purpose of the initiative and the expected outcome(s). Verbally conveying it is not enough. Writing it down helps actualize it into something “real” that can more easily be refined and shared. One way to articulate the expected outcome is to complete the following sentence: “If the initiative is successful, ________ is expected to happen.” Or, a slight variant: “In order for this initiative to be deemed successful, it must _______________.” Further refine the outcomes to capture the properties of SMART goals: Specific, Measurable, Achievable, Realistic, and Time-specific.
Tip 2: Substantiate in writing how you believe the initiative will result in the expected outcome(s). Be specific and detailed. Incorporate prior research, best practices, how it has worked at other institutions, etc. If no prior research is available, lay out the logical argument for how the initiative will achieve the expected outcome. Pretend you are on an episode of ABC’s Shark Tank and you have two minutes to convince someone to invest the resources required for the initiative (e.g., employee time, funds, etc.). Craft your pitch and read it over. Does it provide a convincing argument for how the initiative will likely result in the expected outcome? If not, further refine it until it can stand on its own. This step takes a lot of effort, but if we are not willing to put the effort into substantiating the value of the initiative, should we be asking anyone else to put effort into implementing it?
Tip 3a: Although Realistic is one of the characteristics of a SMART goal, it tends to get glossed over. Most of us have a tendency to channel our inner Babe Ruth and swing for grandiose, aspirational outcomes. In higher education, this tends to take two forms. The first is the unrealistic belief that most initiatives will have a direct effect on increasing persistence and graduation rates. Lots of factors go into whether or not a student persists and/or graduates, ranging from family and work obligations to their level of motivation and academic preparedness. Very few initiatives have the mass needed to directly move these metrics. Instead, focus on outcomes that you believe will be the direct result of the initiative. This also includes ensuring that the outcome and initiative are in alignment. One way to do this is to visualize your outcome as a tree and your initiative as a saw. Is your saw proportionate to the size of the tree? If you have a chainsaw to cut down a twig, your outcome is too meek given the initiative. If you have a handsaw to cut down a giant Sequoia, your outcome is too lofty given the initiative. Replace your inner Babe Ruth with your inner Zen and seek balance between the tree and the saw.
Tip 3b: Unrealistic outcomes also come in the form of unachievable performance targets used to quantify them (e.g., [outcome] … will increase 5% over the next year). Take time to talk through what would need to happen for this to occur. For example, if the outcome and performance target is to increase enrollments in course X by Y% over two years, how many more students would need to enroll in course X? Based on the response to Tip 2, is it reasonable that the initiative will result in that? What factors might prevent this from happening (e.g., a drop in overall enrollment), and how likely are they to happen? One way to test how realistic the outcome is: ask yourself, “What percentage of my salary would I be willing to bet that the outcome is achieved by the specified time frame?” If the percentage is low, then you should consider revising the outcome to make it more realistic. It can also be very beneficial to set a range as the performance target rather than a single estimate, to account for natural variability from year to year. Also, make sure you distinguish between a percent increase and an increase in percentage points. It may seem like a small nuance, but increasing 20% by 10 percentage points (20% → 30%) is a lot different than increasing it by 10 percent (20% → 22%).
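The percentage-point distinction at the end of Tip 3b is easy to sanity-check with a few lines of code. This is just an illustrative sketch: the 20% baseline comes from the example in the tip, and the function names are my own.

```python
# A quick sketch of the arithmetic behind Tip 3b: adding percentage
# points vs. applying a relative percent increase to a rate.

def add_percentage_points(rate: float, points: float) -> float:
    """Increase a rate (expressed in %) by an absolute number of percentage points."""
    return rate + points

def increase_by_percent(rate: float, percent: float) -> float:
    """Increase a rate (expressed in %) by a relative percent of its current value."""
    return rate * (100 + percent) / 100

baseline = 20.0  # e.g., a hypothetical 20% enrollment rate

print(add_percentage_points(baseline, 10))  # 30.0  (20% -> 30%)
print(increase_by_percent(baseline, 10))    # 22.0  (20% -> 22%)
```

The same "+10" produces very different targets depending on which operation you mean, which is why a written performance target should always spell out whether it is a relative increase or an absolute one.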
Tip 4: Write down any potential secondary or indirect outcomes of the initiative. These are outcomes that you do not expect the initiative to directly influence but that it might indirectly influence. In short, the success of the initiative will not be determined by these things happening, but they are a nice additional potential outcome. Persistence and graduation rates tend to fall in this category. Do this by expanding the sentence in Tip 1 to: “If the initiative is successful, [direct outcome] is expected to happen, which in turn may lead to [indirect outcome] happening.” Indirect outcomes are like donuts on Friday: they are not essential to measuring the success of the day, but they are still worth vigorously pursuing.
Tip 5: Make sure the metric selected to quantify the outcomes directly relates to the outcomes. It sounds straightforward, but it is easy to select the wrong metric for a given outcome. My professional leaning is toward quantitative methods. There is something elegant about numbers and their utility. However, it is important to acknowledge that not all things should be evaluated in quantitative terms. Antoine de Saint-Exupéry eloquently reminds us of this in the excerpt below from his classic tale The Little Prince:
“Grown-ups love figures. When you tell them that you have made a new friend, they never ask you any questions about essential matters. They never say to you, ‘What does his voice sound like? What games does he love best? Does he collect butterflies?’ Instead, they demand: ‘How old is he? How many brothers has he? How much does he weigh? How much money does his father make?’ Only from these figures do they think they have learned anything about him.”
Figures are immensely beneficial and in many cases are the best method for evaluating an expected outcome. But do not let the method drive the need. Go with the approach that best fits the nature of the initiative under evaluation, but do not be deceived into thinking that pursuing qualitative outcomes is an easier shortcut to evaluating the success of an initiative. In many cases it is a much more difficult path. Ideally, most things should be evaluated from both perspectives.
After enthusiastically conveying these tips, the VP, Dean, and Dept. Chair looked at me with puzzled faces and said, “So is this how research directors spend their Friday nights?” I sheepishly slinked back to my table as I muttered to the group, “The next round is on me.”
For more information, please email us at spa@gccaz.edu.