Posts tagged "grad-school"

Note:

At present, I write here infrequently. You can find my current, regular blogging over at The Deliberate Owl.

white laptop with printed papers and books

Writing is not a chore

Many students see writing as a chore. "I finished this study and have great results; now I have to write up the paper, boo." "I want to attend this workshop, but oh drat, they want me to write two pages about something relevant to it."

Repeat after me: Writing is not a chore.

Writing may be difficult. You may struggle to explain your ideas coherently and concisely. You may be in a never-ending battle with proper English grammar.

Writing may be time-consuming. You may spend an hour agonizing over one paragraph. You may stay up all night trying to finish a two-page paper (not counting the hours spent trying to get the LaTeX formatting to work or wrangling Word).

Writing is not a chore.

Writing is practice

Writing is practice. Writing is a key means of communication -- in academia and in the rest of the world! Learning to write well will never hurt you and will only help you.

Writing is planning. Writing is thinking. Writing is synthesizing.

Writing your ideas out with an eye toward communicating them to others can help you see the flaws in your arguments, draw new connections between ideas and fields, and organize your thoughts on a subject. Introductions and discussions are especially great for this, since they are the parts of a paper where you connect your work and your ideas to everyone else's.

But not all writing has to be super academic or for a specific purpose. Journals, notebooks, text files: you can jot down ideas about what you're reading and thinking about. Whatever that is. Review your notes periodically. You may see patterns. You may develop new research ideas or figure out themes in your interests.

Write a lot.

It isn't just me saying this. Multiple advisors have told me: Papers become chapters in theses. The act of writing can add rigor to your thinking. Write as you go. Don't just write it all at the end!



a pen sitting on a pad of paper with two extra pens beside it

Communicating ideas

As a student, you need to learn how to explain your work to others.

Which is to say that you need to convince other people that they should care about what you do.

And that's all about the story you tell.

(This isn't a skill only relevant to students. It's relevant to most people. But I'm a student, and a lot of the people I interact with on a daily basis are students, so this is advice targeted at us.)

Tell a story

When you share your ideas and your work with others, you are creating a narrative. You are telling a story. The key thing is to tell a compelling story about your work and to frame your work so that it means something to your audience.

Start big. Situate your work in the larger context. The question you should answer is not "What are you doing?" The question you should answer is "Why should anyone care?"

Find a big important thing people care about. Tell them how it impacts their lives. Then explain how your work is related to that big important thing.

For example, say you are working on a robotic language learning companion for preschool kids. The robot is supposed to help them learn new words. Why do we care? Well, language and literacy are important for everything humans do. It's the primary means of human knowledge transfer! Language is super important. Plus, there's research showing that if kids don't get enough language exposure early on (e.g., at ages 3-5), it'll be hard for them to catch up in school later. Oh no! Language skills are important for academic and life success! But not everyone has those skills! Enter robot. This robot can help young kids develop language skills at a critical time, thus saving them from a life of misery and pain!

Or, you know, something less dramatic. But you get the idea. Situate your work in a larger problem. Then dive in and explain how what you're doing fits into the larger problem, even if it's just a tiny little piece of that larger problem.

Make your audience care. Tell them a story.



question marks on a purple background

What are you doing with your life? Why?

Last year, I took a seminar for Media Lab PhD students. During one class, we pondered what questions we ought to be asking as we began our journey toward seemingly distant proposals and dissertations.

We asked questions about ourselves. About our research. Why we do what we do. How we can do what we do better. Who we care about. Our visions. Our passions.

We were given a handout with the following list to start us off:

13 Questions Every PhD Student Should Ask

compiled by Prof. Judy Olson, University of Michigan, for HCI graduate students.

  • What is the problem? What are you going to solve?
  • Who cares? Why do people care about this problem?
  • What have other people done about it?
  • Why is that not sufficient? What are the gaps and unanswered questions?
  • What are you going to do about it? (Approach)
  • What are you really going to do about it? (Methods)
  • What do you expect to find?
  • What did you find? (Findings)
  • What does this mean? (Conclusions)
  • So what? (Implications)
  • What are you going to do next?
  • Where are you going to publish?
  • What are you going to be doing in 5 years?

Then we had to brainstorm our own lists of questions. Here's what my seminar class came up with:

Questions from Media Lab PhD students in 2014

  • How are you going to use it in the real world?
  • How are you going to change people's lives?
  • Will other people use it?
  • What is the question or opportunity? Where have we not gone yet - where are the new frontiers?
  • What does your advisor think you should do?
  • Why is it not incremental? How are you changing the conversation?
  • What did you learn?
  • What do you want to learn?
  • Why would the world (or your grandmother) be excited about it?
  • How can other people build on your work?
  • How could you fail?
  • How do you define success?
  • What other skills should you be learning now?
  • How do you take in the right amount of criticism?
  • How do you work with others and collaborate?
  • Who do you want to share your work with?
  • Who should you interact with to learn more about your field?
  • What's the best way to share your research?
  • What's the best way to get media attention?

Then we got to see the questions brainstormed by students in previous years. Here's what they asked:

Questions from Media Lab PhD students in 2012

  • What am I interested in?
  • What do I want to learn?
  • How do I want to learn those things?
  • Why am I here?
  • Why me? What is my uniqueness to solve this problem?
  • What special skills do I bring to this?
  • Why do this in an academic environment?
  • What is the solution (not the problem)?
  • What is my vision?
  • What is my passion?
  • Why now?
  • What are my "bets"?
  • Who do I want to work with?

Questions from Media Lab PhD students in 2011

  • Does a PhD enable me to accomplish my dreams? Is this what I want?
  • What am I passionate about?
  • How can I leverage resources around me?
  • What new activities can I enable (rather than problems I can solve)?
  • How can I most effectively impact the world?
  • Who should I choose as collaborators?

Questions from Media Lab PhD students in 2010

  • What is my field?
  • How can I balance my research with the rest of my life?
  • How do my strengths contribute to my chosen field?
  • Am I happy?
  • Do I have the right advisor to accomplish what I want?
  • Can I get this done in time? (Scope of work)
  • Do I have the right background for this - should I take additional courses?

Additional questions from Mitch Resnick

  • How will my work expand possibilities and opportunities for others?
  • What principles and values will guide my work?
  • Can I create a map showing how my work relates to what others have done?
  • Who could I collaborate with?
  • What are some compelling examples that highlight the importance of this work?
  • What community do I want to be a part of?
  • Can I make progress on this problem through an iterative process?

A lot to think about.

Can you answer them all?



a girl reaches her hand toward the face of a fluffy red robot, which sits on the table in front of her

Socially Assistive Robotics

This project was part of the Year 3 thrust for the Socially Assistive Robotics: An NSF Expedition in Computing grant, which I was involved in at MIT in the Personal Robots Group.

The overall mission of this expedition was to develop the computational techniques that could enable the design, implementation, and evaluation of "relational" robots, in order to encourage social, emotional, and cognitive growth in children, including those with social or cognitive deficits. The expedition aimed to increase the effectiveness of technology-based interventions in education and healthcare and to enhance the lives of children who may require specialized support in developing critical skills.

The Year 1 project targeted nutrition; Year 3 targeted language learning (that's this project!); Year 5 targeted social skills.

Second-language learning companions

This project was part of our effort at MIT to develop robotic second-language learning companions for preschool children. (We did other work in this area too: e.g., several projects looking at what design features positively impact children's learning as well as how children learn and interact over time.)

The project had two main goals. First, we wanted to test whether a socially assistive robot could help children learn new words in a foreign language (in this case, Spanish) more effectively by personalizing its affective/emotional feedback.

Second, we wanted to demonstrate that we could create and deploy a fully autonomous robotic system at a school for several months.

a boy sits at a table with a fluffy robot on it and leans in to peer at the robot's face, while the robot looks down at a tablet

Tega Robot

We used the Tega robot. Designed and built in the Personal Robots Group, it's a squash-and-stretch robot specifically designed to be an expressive, friendly creature. An Android phone displays an animated face and runs the control software. The phone's sensors can capture audio and video, which we can stream to another computer, either so a teleoperator can decide what the robot should do next or, in other projects, as input for various behavior modules, such as speech entrainment or affect recognition. We can stream live human speech, with the pitch shifted up to sound more child-like, to play on the robot, or play back recorded audio files.
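To make the voice step concrete, here is a minimal sketch of pitch-shifting a recorded utterance upward, assuming the audio is available as a file and using the open-source librosa and soundfile libraries. The file names and the four-semitone shift are illustrative choices of mine, not the group's actual pipeline or parameters.

    # Sketch: shift adult speech up a few semitones so it sounds more
    # child-like before playing it on the robot. Illustrative only, not
    # the Personal Robots Group's production code.
    import librosa
    import soundfile as sf

    def make_childlike(in_path, out_path, semitones=4.0):
        """Load a recorded utterance, shift its pitch up, and save it."""
        audio, sr = librosa.load(in_path, sr=None)  # keep original sample rate
        shifted = librosa.effects.pitch_shift(audio, sr=sr, n_steps=semitones)
        sf.write(out_path, shifted, sr)

    make_childlike("operator_speech.wav", "robot_speech.wav")  # hypothetical file names

A live teleoperation setup would apply the same transformation to small streaming buffers to keep latency low, rather than to whole files, but the core idea is the same.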

Here is a video showing one of the earlier versions of Tega. Here's research scientist Dr. Hae Won Park talking about Tega and some of our projects, with a newer version of the robot.

A fluffy red robot sits behind a tablet, which is lying on a table

Language learning game

We created an interactive game that kids could play with a fully autonomous robot and the robot's virtual sidekick, a toucan shown on a tablet screen. The game was designed to support second language acquisition. The robot and the virtual agent each took on the role of a peer or learning companion and accompanied the child on a make-believe trip to Spain, where they learned new words in Spanish together.

Two aspects of the interaction were personalized to each child: (1) the content of the game (i.e., which words were presented), and (2) the robot's affective responses to the child's emotional state and performance.
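As a reading aid, here is a toy, rule-based sketch of those two decision points. Every name and rule below is invented for illustration; the study's actual affective personalization is described in the AAAI 2016 paper listed under Publications.

    # Toy sketch of the two personalization decisions. All names and rules
    # are invented; see the AAAI 2016 paper for the system actually used.

    def choose_next_word(word_scores):
        """Content personalization: favor the target word the child knows least."""
        # word_scores maps each Spanish target word to a 0-1 estimate of how
        # well the child knows it (e.g., from earlier answers in the game).
        return min(word_scores, key=word_scores.get)

    def choose_robot_affect(child_valence, answered_correctly):
        """Affective personalization: react to the child's state and performance."""
        if answered_correctly:
            # Mirror an excited child; stay warm but calmer otherwise.
            return "excited_praise" if child_valence > 0 else "gentle_praise"
        # After a miss, encourage rather than flatly correct.
        return "upbeat_encouragement" if child_valence > 0 else "sympathetic_encouragement"

    words = {"gato": 0.8, "sombrero": 0.2, "pelota": 0.5}
    print(choose_next_word(words))            # -> sombrero (least known)
    print(choose_robot_affect(-0.3, False))   # -> sympathetic_encouragement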

This video shows the robot, game, and interaction.

scene from a tablet app showing a toucan looking at things in a bedroom: a suitcase, a closet, shirts, balls, a hat

Study

We conducted a 2-month study in three "special start" preschool classrooms at a public school in the Greater Boston Area. Thirty-four children ages 3-5, with 15 classified as special needs and 19 as typically developing, participated in the study.

The study took place over nine sessions: initial assessments, seven sessions playing the language learning game with the robot, and a final session with goodbyes to the robot and posttests.

We found that children learned the new words presented during the interaction, that they mimicked the robot's behavior, and that the robot's affective personalization led to more positive responses from the children. This study provided evidence that children will engage with a social robot as a peer over time, and that personalizing a robot's behavior to children can lead to positive outcomes, such as greater liking of the interaction.

a girl mimics the head tilt and expression shown by a fluffy robot

Links

Publications

  • Kory-Westlund, J., Gordon, G., Spaulding, S., Lee, J., Plummer, L., Martinez, M., Das, M., & Breazeal, C. (2015). Learning a Second Language with a Socially Assistive Robot. In Proceedings of New Friends: The 1st International Conference on Social Robots in Therapy and Education. (*equal contribution). [PDF]

  • Kory-Westlund, J. M., Lee, J., Plummer, L., Faridi, F., Gray, J., Berlin, M., Quintus-Bosz, H., Hartmann, R., Hess, M., Dyer, S., dos Santos, K., Adalgeirsson, S., Gordon, G., Spaulding, S., Martinez, M., Das, M., Archie, M., Jeong, S., & Breazeal, C. (2016). Tega: A Social Robot. In S. Sabanovic, A. Paiva, Y. Nagai, & C. Bartneck (Eds.), Proceedings of the 11th ACM/IEEE International Conference on Human-Robot Interaction: Video Presentations (p. 561). Best Video Nominee. [PDF] [Video]

  • Gordon, G., Spaulding, S., Kory-Westlund, J., Lee, J., Plummer, L., Martinez, M., Das, M., & Breazeal, C. (2016). Affective Personalization of a Social Robot Tutor for Children's Second Language Skills. Proceedings of the 30th AAAI Conference on Artificial Intelligence. AAAI: Palo Alto, CA. [PDF]

  • Kory-Westlund, J. M., Gordon, G., Spaulding, S., Lee, J., Plummer, L., Martinez, M., Das, M., & Breazeal, C. (2016). Lessons From Teachers on Performing HRI Studies with Young Children in Schools. In S. Sabanovic, A. Paiva, Y. Nagai, & C. Bartneck (Eds.), Proceedings of the 11th ACM/IEEE International Conference on Human-Robot Interaction: alt.HRI (pp. 383-390). [PDF]



a pair of bright, fluffy dragon robots sitting beside each other on a table

Social robots as language learning companions for children

Language learning is, by nature, a social, interactive, interpersonal activity. Children learn language not only by listening, but through active communication with a social actor. Social interaction is critical for language learning.

Thus, if we want to build technology to support young language learners, one intriguing direction is to use robots. Robots can be designed to use the same kinds of social, interactive behaviors that humans use—their physical presence and embodiment give them a leg up in social, interpersonal tasks compared to virtual agents or simple apps and games. They combine the adaptability, customizability, and scalability of technology with the embodied, situated world in which we operate.

The robot we used in these projects is called the DragonBot. Designed and built in the Personal Robots Group, it's a squash-and-stretch robot specifically designed to be an expressive, friendly creature. An Android phone displays an animated face and runs the control software. The phone's sensors can capture audio and video, which we can stream to another computer, either so a teleoperator can decide what the robot should do next or, in other projects, as input for various behavior modules, such as speech entrainment or affect recognition. We can stream live human speech, with the pitch shifted up to sound more child-like, to play on the robot, or play back recorded audio files.

Here is a video showing the original DragonBot robot, with a brief rundown of its cool features.

A child and a woman sit in front of a small table, looking at and talking with two fluffy dragon robots that are on the table

Social robots as informants

This was one of the very first projects I worked on at MIT! Funded by an NSF cyberlearning grant, this study and the studies that followed explored several questions about preschool children's word learning from social robots, namely:

  • What can make a robot an effective language learning companion?
  • What design features of the robots positively impact children's learning and attitudes?

In this study, we wanted to explore how different nonverbal social behaviors impacted children's perceptions of the robot as an informant and social companion.

We set up two robots. One was contingently responsive to the child—e.g., it would look at the child when the child spoke, it might nod and smile at the right times. The other robot was not contingent—it might be looking somewhere over there while the child was speaking, and while it was just as expressive, the timing of its nodding and smiling had nothing to do with what the child was doing.

For this study, the robots were both teleoperated by humans. I was one of the teleoperators—it was like controlling a robotic muppet!
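Purely as an illustration of the manipulation (remember, the actual behaviors came from human teleoperators, so nothing below is from the real system), the difference between the two conditions amounts to whether the robot's reactions are a function of what the child is doing:

    # Hypothetical sketch of the contingent vs. non-contingent conditions.
    import random

    def contingent_robot(child_is_speaking):
        """Reactions are timed to the child: attend while the child speaks."""
        if child_is_speaking:
            return random.choice(["look_at_child", "nod", "smile"])
        return "idle_glance"

    def noncontingent_robot(child_is_speaking):
        """Equally expressive, but behavior ignores what the child is doing."""
        return random.choice(["look_at_child", "look_away", "nod", "smile", "idle_glance"])

Both robots draw on the same repertoire of behaviors; only the coupling to the child's behavior differs.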

Each child who participated in the study got to talk with both robots at the same time. The robots presented some facts about unusual animals (i.e., opportunities for the child to learn). We did some assessments and activities designed to give us insight into how the child thought about the robots and how willing they might be to learn new information from each robot—i.e., did the contingency of the robot's nonverbal behavior affect whether kids would treat the robots as equally reliable informants?

We found that children treated both robots as interlocutors and as informants from whom they could seek information. However, children were especially attentive and receptive to whichever robot displayed greater nonverbal contingency. This selective information seeking is consistent with other recent research showing that children are quite sensitive to their interlocutor's nonverbal signals and use those signals as cues when determining which informants to question or endorse.

In sum: This study provided evidence that children are sensitive to a robot's nonverbal social cues and will use that information when deciding whether a robot is a credible informant, just as they do with people.

Links

Publications

  • Breazeal, C., Harris, P., DeSteno, D., Kory, J., Dickens, L., & Jeong, S. (2016). Young children treat robots as informants. Topics in Cognitive Science, pp. 1-11. [PDF]

  • Kory, J., Jeong, S., & Breazeal, C. L. (2013). Robotic learning companions for early language development. In J. Epps, F. Chen, S. Oviatt, & K. Mase (Eds.), Proceedings of the 15th ACM International Conference on Multimodal Interaction (pp. 71-72). ACM: New York, NY. [on ACM]

Word learning with social robots

We did two studies specifically looking at children's rapid learning of new words. Would kids learn words with a robot as well as they do from a human? Would they attend to the robot's nonverbal social cues, like they do with humans?

Study 1: Simple word learning

This study was pretty straightforward: Children looked at pictures of unfamiliar animals with a woman, with a tablet, and with a social robot. The interlocutor provided the names of the new animals—new words for the kids to learn. In this simple word-learning task, children learned new words equally well from all three interlocutors. We also found that children appraised the robot as an active, social partner.

In sum: This study provided evidence that children will learn from social robots, and will think of them as social partners. Great!

With that baseline in place, we compared preschoolers' learning of new words from a human and from a social robot in a somewhat more complex learning task...

Two panels: In the first, a child looks at a dragon robot, which looks at her while saying a word; in the second, the child watches the robot look down at a tablet

Study 2: Slightly less simple word learning

When learning from human partners, children pay attention to nonverbal signals, such as gaze and bodily orientation, to figure out what a person is looking at and why. They may follow gaze to determine what object or event triggered another's emotion, or to learn about the goal of another's ongoing action. They also follow gaze in language learning, using the speaker's gaze to figure out what new objects are being referred to or named. Would kids do that with robots, too?

In this study, children viewed two images of unfamiliar animals at once, and their interlocutor (human or robot) named one of the animals. Children needed to monitor the interlocutor's nonverbal cues (gaze and bodily orientation) to determine which picture was being referred to.

We added one more condition. How "big" of actions might the interlocutor need to do for the child to figure out what picture was being referred to? Half the children saw the images close together, so the interlocutor's cues were similar regardless of which animal was being attended to and named. The other half saw the images farther apart, which meant the interlocutor's cues were "bigger" and more distinct.

As you might expect, when the images were presented close together, children subsequently identified the correct animals at chance level with both interlocutors. So ... the nonverbal cues weren't distinct enough.

When the images were presented further apart, children identified the correct animals at better than chance level from both interlocutors. Now it was easier to see where the interlocutor was looking!
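The publications listed below report the actual analyses; as an illustration of what "better than chance" means here, with two pictures chance is 50%, and a binomial test is one standard way to check performance against it. The trial counts below are hypothetical.

    # Testing above-chance picture identification (chance = 0.5 with two
    # pictures). Hypothetical counts; the papers report the real analyses.
    from scipy.stats import binomtest

    correct, total = 14, 20
    result = binomtest(correct, total, p=0.5, alternative="greater")
    print(result.pvalue)  # small p-value -> better than chance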

Children learned equally well from the robot and the human. Thus, this study provided evidence that children will attend to a social robot's nonverbal cues during word learning as a cue to linguistic reference, as they do with people.

Links

Publications

  • Kory-Westlund, J., Dickens, L., Jeong, S., Harris, P., DeSteno, D., & Breazeal, C. (2015). A Comparison of Children Learning from Robots, Tablets, and People. In Proceedings of New Friends: The 1st International Conference on Social Robots in Therapy and Education. [talk] [PDF]

  • Kory-Westlund, J. M., Dickens, L., Jeong, S., Harris, P. L., DeSteno, D., & Breazeal, C. L. (2017). Children use non-verbal cues to learn new words from robots as well as people. International Journal of Child-Computer Interaction. [PDF]

