Sunday, May 5, 2013

Cochlear Implants

            As a musician, the ability to hear is… crucial.  If pressed to name a deaf musician, most people would name Ludwig van Beethoven or another noteworthy composer or performer who lost their hearing late in their career.  Very few could name someone who lost their hearing at a very young age.  This is because listening to and, especially, making music rely heavily on hearing.  This is not to say that deaf people do not listen to or appreciate music.  In fact, quite the opposite is true; most people who are profoundly deaf can still enjoy music and dance because they can feel the music’s vibrations.  People who are deaf can also enjoy music through signed song.  For some, deafness is not a permanent condition.

            A cochlear implant is a surgically implanted electronic device that provides a person who is profoundly deaf, or severely hard of hearing, with a sense of sound.  Although the quality of sound may differ from natural hearing, patients are able to hear and understand speech and environmental sounds, and even enjoy music!  The effects of this technology can be life-changing.  I cannot imagine what it must be like to sense sound for the first time, let alone hear music.  Below is a video of such an experience, as a two-year-old hears his mother’s voice for the first time thanks to a cochlear implant.

Smart Board: An Amazing Teaching Tool

             As I reflected on my post about assistive technology and as I completed my final project, I was continually drawn to the Smart Board.  My first encounter with the technology was as a student in high school.  At the time, I was amazed at the sheer coolness of an interactive projection screen.  That being said, my teachers who were fortunate enough to have a Smart Board in their classroom rarely used it as more than a projection screen and saved the interactive capabilities for rare, gimmick-driven occasions.  Reflecting on the Smart Board through a pedagogical lens has both shown me that my teachers missed an opportunity to enhance their lessons and informed my conclusion that Smart Boards deserve a place in almost any classroom.  The Smart Board is less an accessory and more a valuable resource for enriching the learning environment.  Because it projects images, produces sound, and is interactive, it caters to visual, auditory, and tactile learners all at once.  How often can one technology provide so much?  An immediate example of the Smart Board’s application to music education is an adaptation of one of the technologies I referenced in my post about assistive technology.  I explained that some sort of Velcro board for placing notes on a staff would be an excellent example of assistive technology.  The Smart Board could take this idea a step further by making the notes and staff completely digital.  This way, one could ultimately save or print whatever the student composes through this system.  I imagine that with the appropriate program, one could even have the student compose using the “drag and drop” idea and have the computer “perform” what they wrote; a rough sketch of that idea appears below.  This is only one example of the seemingly endless uses for this technology.
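            To make the “drag and drop, then perform” idea concrete, here is a minimal sketch in Python of what such a program might do behind the scenes.  The note list, the three pitches, and the perform() function are hypothetical stand-ins of my own, not part of any Smart Board software; the script simply renders a student’s dragged notes as tones in a WAV file using only the standard library.

```python
# A minimal sketch (not a Smart Board program) of the "drag and drop, then
# perform" idea: the hypothetical list below stands in for notes a student
# dragged onto a digital staff, and the script renders them as simple tones.
import math
import struct
import wave

SAMPLE_RATE = 44100

# Hypothetical student "composition": (note name, duration in beats)
composition = [("B", 1), ("A", 1), ("G", 2), ("B", 1), ("A", 1), ("G", 2)]

# Equal-tempered frequencies for the three recorder notes used in class
FREQS = {"G": 392.00, "A": 440.00, "B": 493.88}

def tone(freq, seconds):
    """Generate one sine-wave tone as a list of 16-bit samples."""
    n = int(SAMPLE_RATE * seconds)
    return [int(32767 * 0.4 * math.sin(2 * math.pi * freq * i / SAMPLE_RATE))
            for i in range(n)]

def perform(notes, bpm=90, filename="composition.wav"):
    """'Perform' the dragged notes by writing them to a WAV file."""
    beat = 60.0 / bpm
    samples = []
    for name, beats in notes:
        samples.extend(tone(FREQS[name], beats * beat))
    with wave.open(filename, "w") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(SAMPLE_RATE)
        f.writeframes(struct.pack("<" + "h" * len(samples), *samples))

perform(composition)
```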

Final Project


 
            For this matrix, I analyzed a lesson for beginning recorder that I worked with in CURR 310 (Inclusion Module).  It is important to note that the unmodified plan was intended for 9th grade but had unreasonably low expectations and objectives.  Therefore, I have treated the lesson as though it were intended for a 3rd grade general music class.  In this lesson, the students will learn basic recorder technique as well as how to read very basic notation.  The lesson culminates with group performances of “Hot Cross Buns” with teacher accompaniment on piano.
 1.         I will begin the lesson by introducing the instrument to the class.  After demonstrating the basic technique of how to hold the recorder and produce a sound, I will use the Smart Board to display an interactive recorder fingering chart.  Using the chart, I will explain the fingerings for the notes G, A, and B and demonstrate the way they sound.  At this point, students will have the opportunity to experiment with their instruments.  Based on NJCCC Music Standard 1.1.5.B.2, the students will begin to explore the elements of music as they apply to their recorder playing.  They will specifically focus on pitch and timbre at this point in the lesson.  Also, the students will be utilizing the interactive fingering chart (simulation) to explore the complex system of how to produce the appropriate sound on their instruments as per NETS 1.c.
 2.         As students become comfortable holding and operating their recorders, I will introduce basic notation for the three notes we have covered and also introduce rhythmic values for those notes.  Again, this will all be shown on the Smart Board.  After playing a simple melody from the Smart Board, student volunteers will have the opportunity to build their own melody (drag and drop) for the class using the notes and rhythms we have covered.  Next, I will create a new melody for the students to practice in small groups.  In these groups, students will use Garage Band (or similar recording software) to record and evaluate their own playing as they practice.  This step is aligned with NJCCC Music Standard 1.3.2.B.1 as the students will be “play[ing] on pitch from basic notation in the treble clef, with consideration of pitch, rhythm, dynamics, and tempo.”  Because of the Garage Band element, this step incorporates NETS 2.a. “Interact, collaborate… with peers… employing a variety of digital environments and media.”
 3.         In this step, students will be given sheet music for the song “Hot Cross Buns.”  I will facilitate group and individual practice of the piece through a variety of strategies.  We will continue to enrich the practicing by recording and evaluating using Garage Band and also utilizing a metronome to maintain time.  This step aligns to NJCCC Music Standard 1.3.2.B.6 as the students will be learning a more complex melody and working in groups and individually.  Students will be given the opportunity to use Garage Band again during this step as per NETS 6.b.
 4.         At this point, the class will finally perform the piece.  Students will perform in groups as I accompany them on the piano.  By using a MIDI keyboard, I can employ a variety of different sounds as well as provide a rhythmic accompaniment.  This is important for the purpose of variety but will also open up a dialogue about texture based on the keyboard sounds.  Again, students will “Interact, collaborate, and publish with peers, experts, or others employing a variety of digital environments and media.”  After successfully performing the song in groups, individuals will have the opportunity to improvise (using the notes they are familiar with) over the basic harmonies of the song.  Improvisation is one of the more overlooked skills in this sort of setting and is also featured in the national and state standards.  NJCCC Music Standard 1.3.2.B.5 indicates that students will “Improvise short tonal and rhythmic patterns over ostinatos, and modify melodic or rhythmic patterns using selected notes and/or scales to create expressive ideas.”
 5.         Finally, as a homework assignment, students will continue to practice the three notes covered in class by working with a parent or other adult.  Students will also need to continue practicing “Hot Cross Buns” at home.  Students can optionally record their performances using a portable recording device.  They will also be encouraged to play along with one of several prerecorded accompaniments that I will have uploaded to YouTube.  Students will continue to hone their skills at home while applying the standards featured in previous steps.

Sunday, April 28, 2013

Assistive Technology for Music Education

            Assistive technology refers to assistive, adaptive, and rehabilitative devices for people with disabilities.  Personally, it seems only right that everyone have the opportunity to participate in music in some way; it is a human right.  Also, federal laws such as IDEA 2004, Section 504 of the Rehabilitation Act of 1973, and the Americans with Disabilities Act of 1990 ensure that students with special needs have equal access to education.  Therefore, it is no surprise that there is a wide range of resources available to increase the accessibility of music making.

            Based on the definition of assistive technology, one of the simplest examples that would help in music composition (or any context) is a pencil with a specialized grip for the student.  Another useful tool when teaching composition is a Velcro-type board that allows students to physically place notes on a staff.  Musical instruments are an entire world of assistive technology.  These range from instruments, such as bells, that are inherently accessible to students with motor-skill challenges, to adaptations for existing instruments, such as a trombone slide extender.  Further innovative examples include music reading devices that make reading easier by magnifying and adjusting the brightness of scanned images.  These devices even include the ability to “write” on scores with a special stylus and offer a foot-pedal page-turning option.  Another very interesting device is the Soundbeam, which maps body movement to sound production.  Regardless of how simple or advanced, assistive technology is a necessity in music education.

Jazz Festival Technology

            Two weeks ago, I attended the Princeton Jazz Festival.  It was a two-day event held at Princeton High School that featured numerous middle school and high school jazz ensembles.  I went to the second night of the festival to see the high school bands.  While I was there, I was rather surprised by the extensive use of technology by both the performers and the adjudicators.

            Some of the technologies in use at the festival were the kinds that are always present with a contemporary big band, including microphones and amps for the musicians on stage.  One of the most obvious technologies, which I noticed immediately, was the pair of large screens on either side of the stage showing a live projection of the performers.  Two cameras were stationed at either side of the stage and were operated by high school students throughout the performance.  I thought this was a great way to make the festival a cross-curricular event.  The most astonishing example of technology in use at the event, however, did not involve the students at all.

            Each of the three judges responsible for evaluating the groups and their performances was equipped with a laptop and a smartpen.  Although a laptop is not particularly noteworthy, it is worth mentioning that the judges’ comments and scores were instantly sent to the control room, where a fourth official compiled and printed the results for the individual bands.  As for the smartpen technology, I still have not recovered from how cool it was!  The judges simply pressed the record button printed in their special notebooks and were able to take notes while simultaneously recording verbal commentary; both were included in the packages given to each band at the end of the event.  It was expedient, efficient, and very impressive!

Friday, April 26, 2013

Hooktheory

            Back in September, a friend of mine sent me a link to a blog post titled “I analyzed the chords of 1300 popular songs for patterns. This is what I found.”  Although it is written in a somewhat conversational manner, the research and results posted by Dave Carlton are some of the most in-depth, serious, and accessible analyses of popular music available.  The first part of the study explores the popularity of certain chords, beginning with the most popular keys pieces are written in, followed by the most popular chords.  The second part of the study seeks to answer, based on statistics, “What chord should come next?”  This section takes any chord and provides the frequency with which any other chord follows it, based on the 1,300 songs.  Since my original visit to the site, however, the project has grown into a larger community of popular song analysis.
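            As an illustration of the counting behind “What chord should come next?”, here is a tiny sketch in Python.  The four progressions are made-up stand-ins of my own; Hooktheory’s actual analysis is drawn from its database of 1,300+ songs, and its methods may well differ from this toy version.

```python
# A small sketch of the counting behind "What chord should come next?"
# The progressions below are made-up stand-ins for Hooktheory's database.
from collections import Counter, defaultdict

progressions = [
    ["I", "V", "vi", "IV"],
    ["I", "IV", "V", "I"],
    ["vi", "IV", "I", "V"],
    ["I", "V", "IV", "I"],
]

# Count how often each chord follows each other chord.
transitions = defaultdict(Counter)
for prog in progressions:
    for current, following in zip(prog, prog[1:]):
        transitions[current][following] += 1

def next_chord_frequencies(chord):
    """Return the relative frequency of each chord that follows `chord`."""
    counts = transitions[chord]
    total = sum(counts.values())
    return {c: n / total for c, n in counts.items()}

print(next_chord_frequencies("I"))   # {'V': 0.75, 'IV': 0.25}
```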

            This is a wonderful project with regard to music education, especially music theory.  As with almost any subject, one of the main factors that determines a student’s interest is relevance.  This is certainly true for music theory.  The Hooktheory website is a user-friendly database that takes music theory skills and applies them to understanding popular music.  Not only is it simply interesting, it would also serve as an excellent teaching tool.

            You can learn more about Hooktheory and explore the analyses at: http://www.hooktheory.com/

Thinking Back to Music Technology

            After posting about the EAMIR project, I couldn’t help but think back to my experience in Music Technology class.  Almost all of the teaching apps available through EAMIR were created using a program known as Max/MSP/Jitter (the name combines the environment’s three components).  We spent the majority of the second half of the semester focusing on Max.  I found the program and its capabilities captivating.  It is rather user friendly and relies heavily on MIDI (Musical Instrument Digital Interface) data.  For my final project, I chose to work with Max to develop an interactive beginning trumpet method book.

            The early pages of the book cover the basics of holding the instrument and producing sound.  One of the fundamental aspects of producing sound on the trumpet is setting up your embouchure (the muscles around your mouth).  In order to demonstrate this, I made a patch that corresponded to a page in the book about the way your lips should look before playing the trumpet.  It consisted of a series of (rather silly) pictures of me demonstrating the wrong and right ways to do this, next to a live webcam stream of the student so that they can compare.  The second patch I made is called “Instant Accompaniment.”  As the name suggests, it provides the student with a basic accompaniment to practice along with.  The accompaniment is limited to drum tracks, but students have the option of Latin, Rock, or Hip Hop at three different tempos; a rough outline of that selection logic appears below.  It was fun and rewarding to use Max to create a resource I can utilize in the future.
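            Since the patch itself lives in Max, here is a rough sketch of its selection logic written in Python instead.  The pattern strings, tempo names, and the accompaniment_schedule() function are illustrative stand-ins of my own, not the actual data or structure of the patch.

```python
# A rough sketch (in Python, not Max) of the logic behind the
# "Instant Accompaniment" patch: pick a style and tempo, then lay out
# one measure of a drum pattern as timed hits.
PATTERNS = {
    # 16-step patterns, one character per 16th note: k=kick, s=snare, h=hat
    "rock":    "k.h.s.h.k.h.s.h.",
    "latin":   "k..hk.h.s..hk.h.",
    "hip hop": "k...s..k..k.s...",
}
TEMPOS = {"slow": 80, "medium": 100, "fast": 120}

def accompaniment_schedule(style, tempo_name):
    """Return (time_in_seconds, hit) pairs for one measure of the pattern."""
    bpm = TEMPOS[tempo_name]
    sixteenth = 60.0 / bpm / 4          # duration of one 16th note
    pattern = PATTERNS[style]
    return [(round(i * sixteenth, 3), hit)
            for i, hit in enumerate(pattern) if hit != "."]

for time, hit in accompaniment_schedule("rock", "medium"):
    print(f"{time:5.3f}s  {hit}")
```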

EAMIR


            One of the most interesting and fun applications of innovative technology to music education is definitely the EAMIR project created by V.J. Manzo.  Manzo was teaching at Montclair State University when I took the course Music and Computer Technology in 2011, and it was an enlightening and enjoyable experience.  EAMIR stands for “Electro-acoustic Musically Interactive Room,” which is what Manzo called his classroom when he taught K-12 music.  The project was born from two main obstacles: the diverse levels of musicianship among students and the challenge of addressing the needs of students with disabilities.  Looking to technology as a resource, he developed a series of adaptive instruments to use in his classroom to facilitate learning goals.  Each of these instruments (primarily software instruments) was uploaded and made available for students and their parents to download and use at home.  This grew into the project it is today.  EAMIR is now described as “an open-source music technology project involving alternate controllers, sensors, and adaptive instruments to facilitate music composition, performance, and instruction through a collection of interactive music systems. The EAMIR software apps have been implemented in classrooms, including special needs and disabilities populations, research projects, and composition/performance environments.”

            The apps use a variety of interfaces, each designed with the students in mind.  Many patches simply require a computer and a mouse, while others utilize touch-screen computers, iPads, webcams, and Smart Boards.  Some apps even feature the use of popular video game controllers, including those for Guitar Hero and Dance Dance Revolution.
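            To illustrate one way an adaptive software instrument can work, here is a minimal sketch in Python.  It is my own illustration under the assumption that an input gesture is snapped onto a chosen scale; it is not actual EAMIR code, which is built in Max.

```python
# A minimal sketch of the adaptive-instrument idea (my own illustration, not
# actual EAMIR code): any simple input value, such as a key number or sensor
# reading, is snapped onto a chosen scale so that every gesture a student
# makes produces an in-key pitch.
C_MAJOR = [0, 2, 4, 5, 7, 9, 11]   # scale degrees as semitone offsets from C

def input_to_midi_note(value, scale=C_MAJOR, low_octave=4):
    """Map an input value onto successive notes of the scale."""
    degree = value % len(scale)
    octave = low_octave + value // len(scale)
    return 12 * (octave + 1) + scale[degree]    # MIDI note number (C4 = 60)

# Example: eight consecutive inputs walk up a C major scale from middle C.
print([input_to_midi_note(v) for v in range(8)])
# -> [60, 62, 64, 65, 67, 69, 71, 72]
```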

            To learn more, visit the EAMIR website at: http://www.eamir.org/

Wednesday, April 24, 2013

Interactivity #5

https://docs.google.com/spreadsheet/ccc?key=0As_udrC-Q8endEdTajJTYV9xZmt5WWxnZEw5dHFDSGc&usp=sharing

            For this interactivity I had the opportunity to interview a high school music teacher from a district in North Jersey (he requested that the district not be disclosed).  Although the district has not implemented the NETS, he is somewhat familiar with what they are.  Early in our interview, he began to explain that the NETS, or any similar standards, pose a major challenge in his discipline.  As an instrumental music teacher, he directs the school’s bands and smaller ensembles and finds that the instrumental music classroom or rehearsal space is a difficult environment in which to incorporate technology-centered standards.  This seemed discouraging at first, but as I introduced the example Student Profiles for grades 9-12, we both began to discover some of the ways an ensemble already fulfills these standards.  For example, he immediately remarked that performing in any ensemble requires every member to take part in problem solving in authentic (performance) situations.  That said, this process generally occurs without the use of innovative technology.  I then asked if he ever has students do listening assignments that require them to write about a performance or recording they listened to.  He said that they certainly do, and we simultaneously realized that those assignments also tie into the NETS (3.Select digital tools or resources to use for a real-world task and justify the selection based on their efficiency and effectiveness. (3,6)).  Once again, however, this standard can only be achieved if students are conducting their research on the internet.
            At this point, the teacher explained that another obstacle that would hold back the implementation of a system like this is money.  His district, like many, constantly struggles with financial issues and shrinking budgets, which he believes would prevent the school from purchasing the equipment and software necessary to effectively use the NETS.  Currently, the tools at his students’ disposal include computers with internet access, CD players, and projectors.  This led me to my next question: “Are there any other programs throughout the school that may already be using NETS strategies although they have not officially been implemented?”  After a moment of thought, he excitedly realized that the school’s sign language classes actually use a system of webcams to broadcast the class to another high school in a neighboring district.
            This led to a brief brainstorming session between the teacher and me about other means of implementing the strategies involved in the NETS.  One of the ideas we developed together was to broadcast a performance live to another band (similar to the sign language class) for them to provide feedback, and vice versa.  Another idea was to take listening assignments a step further to incorporate “2.Create and publish an online art gallery with examples and commentary that demonstrate an understanding of different historical periods, cultures, and countries. (1,2)”
            As I mentioned, I was rather surprised at first by how unfamiliar the teacher was with the NETS and that the district had not implemented them or even a similar program.  I am unaware of how many districts in the area have begun to incorporate the NETS, but I had imagined that most had some system in place.  As a future educator, I would introduce the ideas behind the NETS to others in my school by first pointing out the elements that are already being fulfilled, such as those discovered during the interview.  I believe this would be a good way to ease the transition into a full implementation of the strategies.  These standards (or a similar set) are essential for the times we live in.  Luckily, the NETS are still a relatively “young” program.  As time goes on, I imagine they will gain widespread appeal.

Wednesday, April 17, 2013

Interactivity # 4

https://docs.google.com/spreadsheet/ccc?key=0As_udrC-Q8endEdTajJTYV9xZmt5WWxnZEw5dHFDSGc&usp=sharing


            I chose this particular lesson for a number of reasons.  At first, I was attracted to the lesson’s rather obvious use of technology.  The lesson relies heavily on technology, and it is necessary for students to use it in multiple ways.  From that starting point, I was drawn in by the way students would be engaged in research, analysis of information, and production.  The production element of the lesson crosses disciplines, as the students must first complete a writing task before performing and recording for their final product.  Upon basic analysis of the lesson (as per the assignment), I gained even more confidence in my decision based on its wealth of student-centered strategies.  Throughout the lesson, students must think critically and make decisions regarding their final project.  I also particularly like that the final product is something that could easily be shared with peers and family members as well as the teacher.

            If there are any gaps between the goals, strategies, and technologies used, they are very minor.  Each student must individually research a musician of their choice using a computer with an internet connection.  Then they must write a report based on what they learned using a word processor.  The final stages of the project require the students to use multi-track audio software to record themselves reciting their reports and to mix that with excerpts of recordings of their chosen artists.  One potential gap is the fact that, based on previous lessons, students may not have the necessary skills to generate the final product using a program such as Garage Band.  That said, I imagine that issue is easily avoidable with appropriate planning.  The technologies referenced above are basically essential to achieving the curriculum goals.

Monday, March 25, 2013

Interactivity #3: Generating a State of the Art Inventory

            The group process for this interactivity was rather collaborative.  The seven of us were able to effectively and efficiently generate our inventory as a team without ever physically meeting in person.  We began with a group Facebook message to share our ideas, ask questions and organize the way we would handle the task.  We then virtually “met” as a group on our Google Docs spreadsheet to fill in what we had decided to include.  This activity was definitely authentically collaborative as we were able to discuss new ideas and obstacles together (via Facebook) while producing our final product.  Occasionally, the Google Docs environment proved to be challenging.  For example, if two users were simultaneously working in the same cell, one could accidentally overwrite the other’s entry.  Luckily, the collaborative process made those sorts of challenges easy to overcome.

            I believe the final inventory that we generated could be a useful reference for any music educator.  It is organized in an easy-to-use manner and features links for almost every technology listed (with the exception of general technological interfaces such as MIDI).  The vast range of technologies featured in our inventory is also worth noting.  In one spreadsheet, one can find technology as simple as a conductor’s baton or rhythm sticks and as advanced as digital interface protocols.  Through this engaging collaborative experience, I think we certainly created a worthwhile resource.

Tuesday, March 5, 2013

Thoughts About the Impact of the Radio


            After viewing the visual timeline of educational technology, I would have to say that the radio is the technology that influenced music education the most from 1900 to 1990.  The video mentions the WHA broadcasts, starting in 1917, as the first medium used to widely broadcast music education programs, but the impact of the radio certainly extended further than that.  As far as simply listening to music and being exposed to new music go, one’s options were rather limited in the early 20th century.  Recording technology was ever-improving, and the commercial production of records opened up new worlds of music listening to those who could afford both the records and a record player.  With a radio, however, after that initial purchase, exposure to new music was seemingly endless.  Music education relies on a number of basic elements, including student interest, listening, and performing/producing.  Before a teacher has the opportunity to foster students’ interest in music, an external catalyst must light that spark.  Although I did not experience the revolution of the radio firsthand, I imagine it played an integral role in that process.  There is also the self-explanatory listening aspect that the radio expanded upon.
            Upon reading both chapter two of “Rethinking Technology in Schools” and the article “A Social History of Media and Technology in Schools,” I maintain that radio technology had the greatest impact on formalized music education between 1820 and 1990, for better and for worse.  It is arguable that the television had the greatest technological impact on formalized education, but much of the reasoning behind that argument also applies to the radio (albeit decades earlier).  The radio laid the foundation for nearly every broadcasting technology that followed and is still a valuable resource to this day.  As I mentioned above, the radio creates listening and learning opportunities inside and outside of the classroom.  Not only was it a new technology for broadcasting music, it also provided news about current events and dialogue on interesting subjects, and it changed the way advertising worked.  Unfortunately, with such a breakthrough in technology come certain sacrifices as well as setbacks.
            First, it was not long before many “serious music” elitists began to criticize radio stations and their listeners for failing to broadcast “serious music.”  To a certain extent, these critics were correct, but that is no longer the case.  As time went on, public and private radio stations emerged that featured (sometimes exclusively) what these critics refer to as “serious music” (better described as Western art music).  Otherwise, the major issue created by the radio, and later perpetuated by the television, is the listener’s or viewer’s lack of control.  The standardization and plugging of popular music on the radio is one of the issues the philosopher and musicologist Theodor Adorno frequently wrote about in the 1940s.  He explains that radio stations choose what music they play most frequently, and listeners operate under the assumption that the music they enjoy is what the radio will play.  Therefore, radio stations dictate what music the listeners will like.  Although this perspective unrealistically assumes that all listeners abandon their freedom, it is definitely grounded in truth.  Unfortunately, the music industry did, and still does, have a great deal of control over listeners’ preferences.  That being said, listeners do have the power to change the station or walk away.  Therefore, I believe that the positive impacts made by the radio far outweigh the perceived negatives.

Monday, March 4, 2013

Interactivity #2: The History of Technology in Schools


The radio changed the way students experienced music in the 20th century: for better and worse.

Wednesday, February 20, 2013

Interactivity # 1


            Technology plays a vital role in Olivia’s life, in both her school work and her day-to-day activities.  She, like most students, relies on technology as a source of communication, information, and entertainment.  Her cell phone and access to a computer allow her to stay in touch with her friends via phone calls, texting, instant messaging, email, and social networking.  This is much like the relationship with technology that the students I will be teaching will have.  It is important to recognize these uses of technology, as they can affect the way my students communicate with me as well as with each other.  It is reasonable to imagine that students would use social networking (such as setting up a private group on Facebook) as a means of correspondence when working together on a project.  Olivia shares that she often spends four to five hours at a time on the computer.  It is encouraging to hear that much of that time is spent looking up facts and researching things she is unfamiliar with.  She is also always finding new ways to customize her Myspace page.  Olivia is clearly curious and uses her access to the internet to learn more.  If more students are equally curious and resourceful (which I imagine they are, living in the information age), it would be important to utilize that drive and strategy in the classroom.  Finally, Olivia has an iPod that she is almost never without.  As a music educator, I value the ease of sharing music between teacher and student.

            The students in the video “Learning to Change, Changing to Learn” are all beyond familiar with multiple forms of technology.  As with Olivia, these technologies play an integral part in these students’ lives; in some cases, they are a major part of their identities.  It is interesting to note that the students in this video draw the links between technology and their school experiences themselves.  For example, they connect gaming with coordination and communication, and they use cell phone cameras for school projects.  To summarize one student’s thoughts: their access to potentially endless information means they have to refine their decision-making skills.  Also, once again, these students are problem solving using the resources they have at their disposal.

            The three most influential communications technologies in my life are my cell phone, my iPod touch, and my laptop.  I would have to say that a combination of these three devices accounts for a large majority of the ways I learn new information.  My cell phone is ranked first because it is the one device that I always have on my person.  It is quick and efficient mobile communication.  I also keep my calendar on my phone, which is one facet of the device that I do not think students are quite as reliant on as I am.  Next, my iPod is equally portable and has a 32 GB capacity.  It allows me to carry essentially all of the music I could need on any given day.  Also, as long as I have a wireless connection, it lets me look things up with ease, check my email and Facebook, and more.  I believe the students featured in these videos would use an iPod similarly, although they may not be as invested in the ability to check multiple email addresses.  Finally, my laptop combines my other communication technologies and then some.  With a decent computer and an internet connection, anyone has access to endless means of creation and consumption and seemingly endless content.

Monday, February 4, 2013

About Me

My name is Bryan Stepneski; I am 22 years old and a senior studying music education. I have been playing the trumpet for over 12 years and completed my senior recital in early December. I love music, traveling, and spending time with my friends and family. After graduating from MSU, I am interested in teaching instrumental music at either the high school or middle school level, although I am interested in general music as well.

I believe I am rather comfortable with technology; it is definitely a pleasant relationship. During this class, I would most like to learn strategies for integrating technology into the ensemble rehearsal environment. I am familiar with, and more than comfortable using, tools such as iTunes and music notation software, but I would like to learn other ways of incorporating technology into the rehearsal setting.