CHAPTER 1
1.0 INTRODUCTION
On my first day in the establishment, I was taken to the data processing department, where I was to work for six months, and I was placed in the category of temporary workers. I was introduced to the heads of the data processing department and also to most of the staff I was going to be working with, or should I say trained with. It was a very challenging start for me, working in a new environment and meeting completely different people. I realized that I still had to make use of the programming skills I learnt in school. This training will be explained as we go on through the chapters.
I was given a manual during the first week of my stay to read and learn about the various applications I would be using in the department and their terminologies. Some of these applications are: Dimensions, SODA Designer, DM Query, SPSS Statistics, Survey Reporter and SQL. Some terminologies I learnt and used, and their meanings, are:

1. Questionnaire: research work designed with the client to find out what the market/consumers think about a product in the market.
2. Planning sheet: where the period or time frame of a project is laid out.
3. Project: every job is called a project, with the job name included, e.g. Project Paste.
4. Interviewer: the field worker assigned to a project.
5. Puncher: the data entry personnel.
6. Respondent: an individual who fills in a questionnaire.
7. Variables: the individual questions in a questionnaire.
8. PAPI (Pen And Paper Interview): respondents are expected to fill in the questionnaires on their own.
9. CATI (Computer Assisted Telephone Interview): respondents are called and their responses are recorded by an interviewer.
10. CAWI (Computer Assisted Web Interview): respondents answer the questionnaires online via the internet.
11. CAPI (Computer Assisted Personal Interview): respondents are interviewed in person by an interviewer who records their answers on a phone or computer.
12. MDD: the Dimensions metadata file.

These are some of the terminologies we use in our department to identify a particular questionnaire and decide which application should be used. After going through the terminologies and the applications I would learn, I was shown the flow of projects in the department: from the DP manager to the Assistant DP manager, who then decides which DP team would
handle it. Each team has a head analyst, who assigns the project to a particular analyst and then communicates this to the C&S research executive handling the project, so that they know who they will be working with directly should the questionnaire need any form of modification. Once no more modification is needed, the questionnaire goes to the Field department; from Field it goes to the E&C department for editing, then to the data entry department for punching, and then back to the DP team handling the project. This is a simple description in a flow chart.
FLOW IN THE DATA PROCESSING DEPARTMENT
Now I would like to explain the applications I worked on and with. Above is the structure of a designed questionnaire sent by the Q&C dept. to the DP dept. for the creation of a template.

1.0.1 DIMENSIONS
Dimensions is market research software originally developed by SPSS and now owned by IBM. It has since been rebranded (as IBM SPSS Data Collection), but it is still called Dimensions where I worked. It was used mainly for PAPI and CAWI interviews. Dimensions uses three file types: interview scripts (.mdd), mrScriptBasic scripts (.mrs) and data management scripts (.dms). The interview scripts (.mdd) are used, as I said earlier, for creating interviews. An interview script has a metadata section and one or more (user-created) routing sections. The metadata section is used to define the questions that will be asked during the interview, i.e. it defines what you want to ask. The routing section is written in mrScriptBasic and defines which of the questions will be asked during an interview and in what order, i.e. it defines how you want to ask them. mrScriptBasic is a programming language that enables scripting access to the metadata components.
It was a very challenging start for me, working in the new environment, but thanks to my handler Mr. Dare, who helped me understand it by explaining how to use it and giving me projects to do. I realized that I still had to make use of the programming skills I learnt in school, only with a different syntax; once you know the syntax, it is easy to create templates. This is an example of the metadata section of Dimensions, and the routing section of a Dimensions script. I was taught that, just like in every programming language, there are rules governing how the templates are written, such as: 1.
Naming conventions: variable names should begin only with a letter or an underscore (_). 2. Reserved characters such as double quotes (""), single quotes (') and the ellipsis (..) should not be used carelessly in your program. There are different data types, such as info, text, long, double and categorical. 3. Most importantly, we have two categories of responses in questionnaires: a question is either a single response question or a multiple response question. For example, a yes-or-no question is a single response question, while a question like "What are your hobbies?" could be a multiple response question.
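The distinction between single and multiple response questions shows up directly in the metadata section: the numbers in the brackets after the categorical keyword control how many answers may be picked. Below is a minimal sketch in the Dimensions interview-script style; the question and category names are my own illustration, not from any real project.

```
Metadata (en-GB, Question, Label)
    ' Single response: exactly one category may be chosen
    SmokeStatus "Do you smoke?" categorical [1..1]
    {
        _01 "Yes",
        _02 "No"
    };
    ' Multiple response: one or more categories may be chosen
    Hobbies "What are your hobbies?" categorical [1..]
    {
        _01 "Reading",
        _02 "Music",
        _03 "Sports"
    };
End Metadata
```

[1..1] forces exactly one answer (single response), while [1..] sets a lower bound of one with no upper bound (multiple response).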
This makes it look like a normal program, with its own syntax; just as every other program has its syntax and semantics, so does Dimensions. We have predefined functions and also user-defined functions. Some predefined functions are:

.Categories.Filter: used when two questions share the same categories and you want the second question to display only the responses not selected at the first question. It is used alongside DefinedCategories(), e.g.
TeaTried.Categories.Filter = TeaTried.DefinedCategories() - TeaDrink.Response.Value
TeaTried.Ask()

.Ask(): used to ask a question, e.g. Talk.Ask()

.Show(): used to preview a question, e.g. Talk.Show()

.Validation.MinValue / .Validation.MaxValue: set the minimum and maximum accepted values, e.g. Talk.Validation.MaxValue = Take.Response.Value

Loop[..].Ask(): asks each iteration of a loop on a separate screen; all questions in the iteration are shown, e.g. Talk[..].Ask()

Dim: used to declare a new variable in memory, e.g. Dim cat

.QuestionOrder: controls the order in which the questions are asked, e.g. Talk.QuestionOrder = OrderConstants.oRandomize

.ContainsAny() / .ContainsSome() / .ContainsAll(): test whether a response contains any, some or all of a given set of categories.

DefinedCategories(): returns all the categories defined for a question, e.g. Q5.Categories.Filter = Q5.DefinedCategories() - Q4.Response.Value

oRandomize: an order constant used to sort items in random order.

For Each: used with categorical loops to iterate over each item in the question, e.g. For Each iter In OuterBlock

.MustAnswer: specifies whether a question must be answered; it takes either True or False. If True, the question must be answered, e.g. Question.MustAnswer = True

If…Then: used for decision conditions, e.g. If Q5.ContainsAny({_01}) Then …

Select…Case: similar to the If statement, but it branches on a single expression, unlike the If statement, which can combine multiple conditions; the Select Case statement is cleaner, e.g.
Select Case Q5.Response.Value
    Case 1 To 15
    Case 16 To 40
    Case Else
End Select

.Response.Value: contains whatever value or response was given to a particular question.
.Label: used to control how a question's label appears in the running part of Dimensions, for easier punching by the data entry department, e.g. SurveyTitle.Label.Style.Color = "green"

These are just a few of the predefined functions. Here is an example of how we write a particular questionnaire, focusing on one question. Making a response list of this question:

What are your favorite brands of cereals?
Grape Nuts
Corn Flakes
Fruit Loops

IN DIMENSIONS:
The mdd part:

LINE 1    Metadata (en-GB, Question, Label)
LINE 2    Favorites "What are your favorite brands of cereals?" categorical [1..]
LINE 3    {
LINE 4        _01 "Grape Nuts",
              _02 "Corn Flakes",
              _03 "Fruit Loops"
LINE 5    };
LINE 6    End Metadata

The routing section:

LINE 7    Routing (Web)
LINE 8    Favorites.Ask()
LINE 9    End Routing

Whatever is to be written, whether in the metadata or the routing section, must be placed between Metadata (en-GB, Question, Label) … End Metadata or Routing (Web) … End Routing respectively; otherwise it will produce a syntax error. In Line 2, 'Favorites' is the variable name that will be used to represent the question in the routing section, and the words in double quotation marks are the text that will appear for punching.
The categorical [1..] indicates the data type of the question, and [1..] shows the number of responses that may be selected: a lower bound of 1 with no upper bound means you pick from one answer up to as many as you want. In Line 3, the open brace marks the beginning of a category list. Line 4 shows the response list of the question. In Line 5, the close brace marks the end of the response list. Below are a few snapshots showing the Dimensions environment and how the questionnaires are tested to see whether they are fit for data punching. This is the routing section of a project, Project Slingshot. The MDD part of the questionnaire.
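To tie the routing-section functions described above together, here is a short sketch in the same mrScriptBasic style. All the question names used here (TeaDrink, TeaTried, FollowUp, Age and the section blocks) are hypothetical, chosen only for illustration, and each would first have to be defined in the metadata section.

```
Routing (Web)
    TeaDrink.Ask()

    ' Show only the brands not already picked at TeaDrink
    TeaTried.Categories.Filter = TeaTried.DefinedCategories() - TeaDrink.Response.Value
    TeaTried.Ask()

    ' Branch on whether a particular category was selected
    If TeaTried.ContainsAny({_01}) Then
        FollowUp.Ask()
    End If

    ' Branch on a single numeric answer with Select Case
    Select Case Age.Response.Value
        Case 1 To 15
            KidsSection.Ask()
        Case 16 To 40
            AdultSection.Ask()
        Case Else
            SeniorSection.Ask()
    End Select
End Routing
```

This is the general shape most of our routing sections took: ask, filter, then branch with If…Then or Select…Case.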
These are instances of the running environment. That is a full summary of all I learnt about the Dimensions software. It is the most used application in the company, accounting for about 80 percent of its projects.

1.0.2 DM QUERY (Data Mining Query)
This is an application used to clean and edit data obtained from punched questionnaires: after templates have been created in Dimensions for a particular project, the template is sent to the data entry department (a sub-department of the data processing department) for use in punching the data obtained from each treated questionnaire.
When all the questionnaires have been punched, the data is collated and sent back to us, the data analysts, in the DDF format. We analysts use the DM Query application to check for repeated, incomplete or incorrect data, for easy use. This application operates on SQL syntax, using keywords like WHERE, GROUP BY, SELECT, DELETE and ORDER BY. Some samples of these queries are:

1. To check for the total number of duplicates in a set of data:
select respondent_no, count(respondent_no) from vdata group by respondent_no having count(respondent_no) > 1

2. To see a duplicated ID together with its data:
select * from vdata where respondent_no = '23101'

3. To update incomplete data or assign a new value to a variable:
update vdata set respondent_no = '50001' where Respondent.Serial = 247 and respondent_no = '02602'

4. To delete a record:
delete from vdata where respondent_no = '23124' and Respondent.Serial = 247

5. To view the data by respondent number:
select respondent_no, Respondent.Serial from vdata where respondent_no > 23125 and respondent_no < 23131 order by respondent_no

6. To count the completed records (any record outside this count is incomplete):
select count(respondent_no) from vdata where datacollection.status = {completed}

Below are some snapshots showing the Graphical User Interface (GUI) of the application.

1.0.3 IBM SPSS STATISTICS BASE
It is statistical analysis software that delivers the core capabilities you need to take the analytical process from start to finish. It is easy to use and includes a broad range of procedures and techniques to help you increase revenue, outperform competitors, conduct research and make better decisions. It provides the essential statistical tools for every step of the analytical process:
A comprehensive, accurate analysis.
Sophisticated, functional and effective chart creation.
Clearly showing the significance of your findings.
It supports all forms of data sets.
IBM SPSS has two parts or views:

The Data View: shows the cleaned data of the project, which is used for drawing out reports or charts. Responses in this view appear as coded values, i.e. a 1 or 0, except for text variables, whose responses are written out.

The Variable View: shows the variable names in the data and their data types.

Every form of statistical analysis can be drawn from the data cleaned in DM Query. Everything from the chi-square test to frequency calculations, means, standard deviations, etc.
can be done using SPSS. Some snapshots include: the Data View of the project KLT Wave; tables drawn from the data above, showing frequency, percentage, etc.; and charts drawn from the data, including bar charts and pie charts.

1.0.4 IBM SURVEY REPORTER
It is a versatile reporting and visualization solution that enables you to gain the most value from customer feedback and survey research. Simple and intuitive to use, it is designed for information consumers and survey researchers who are interested in interactive reporting and in drawing key insights from survey data.
It is similar to SPSS Statistics Base but is used mainly for charts; its charts are more explanatory, easily accessible and easy to change. It is a more comprehensive way of producing tables and charts from the collected data.

1.0.5 SODA DESIGNER
SODA is also market research software, used mainly for CAPI and CAWI interviews. It is an internet-based application and is more user-friendly: in this type of application, smartphones and tablets are used for punching data. It does not really require programming; only on rare occasions do you have to write syntax.
It already comes with tools for creating templates; an analyst only needs to understand all the tools so that he or she can use the right tool for the right question in a questionnaire. SODA is more like designing a form than a programming application. Its tools fall into two categories:

The single category: these tools are used for single response questions, e.g. the single open-ended text, categorical, subsection, date, time and yes/no tools.

The multiple category: these tools are used for multiple response questions, e.g. the multiple open-ended text, the multiple slide, grids, etc.

The use of SODA Designer is becoming so dominant that it is gradually displacing Dimensions in the market research community. This is because of its user-friendliness and its easy method of retrieving data: data is received as the interview goes on, live, anywhere in the world, thanks to the application's internet connectivity. The only disadvantage of this application is that it is capital intensive, because you need to buy smartphones and tablets for field officers to use for their interviews.
Below are some snapshots of how the application looks.

CHAPTER 2
2.0 KNOWLEDGE AND EXPERIENCES ACQUIRED
In all, throughout my six months here I did a total of approximately 17 questionnaires, most of them while assisting my colleagues. Each of them taught me new things and new and better ways of scripting. To mention a few of the projects that challenged me and made me grow:

Project Juicy: this was among the first set of product test questionnaires I had to deal with. It had a lot of text piping, i.e. whatever is picked in a particular question is expected to show in the next question or in other questions further on in the questionnaire. This was unfamiliar, so it became a problem for me; as at now I can boldly say this project gave me the experience of knowing how to do that.

Project Excitement was one of the most confusing questionnaires I have ever done. Why? It had too many conditions: if you picked this code you jumped two or more questions, and so on. It was also a two-in-one questionnaire, with both a main and a contact questionnaire, and it was difficult for me to create an instance
of how to move from the contact questionnaire to the main one.

Project Spaghetti and Macaroni had a whole lot of ranking in it; from this I learnt how to create a function in Dimensions and how to implement it where it is needed.

Project Slingshot is the longest and most challenging questionnaire I have ever written. It had six visits and used panels; that was the first time I dealt with a questionnaire involving panels. Through this project I learnt how to link almost a whole questionnaire in a loop and keep repeating the same questions until the allocated number of repetitions is reached.
I learnt how to restrict the dates for interviews, and, surprisingly, I learnt how to use arrays in this project too. Slingshot had all the basic constructs that programming languages have, and all were implemented, from loops and while statements to if statements and arrays; it was fascinating.

Project B&H Forecast was like a repetition of Slingshot, but with different formats and a different appearance. The main thing I learnt here came while doing the evoked-set part of the questionnaire, where duplicates are to be removed, leaving only one occurrence, with the regular brand also removed.
It gave me a new idea of scripting: I had to settle down and think about how it was to be done, tried different methods, came up with new ideas in the process of trying, and tested them until I arrived at a suitable one.

I learnt how to make ID cards for some field staff and record them in the company's database without creating duplicates, using SQL Server.

I learnt how to work under intense pressure: this involved working on two or more projects at the same time, each with its own due date. This taught me how to work efficiently and effectively under time constraints.
As the MD of TNS-RMS Nigeria used to say, we should not just work hard but work smart.

I learnt how to clean data using DM Query, as explained in the previous chapter, without damaging the other data. This made me understand the meaning of data integrity (the validity of data, which can be compromised in a number of ways). It is important in the research business that data is not damaged or changed; it is a criminal offence that can put a data analyst in jail, because the integrity of such data would no longer be credible to the client.
You would be giving the client false predictions or charts if the data does not correspond or has been changed. So data cleaning must be done with rapt attention and full concentration, to avoid mistakes. These are a few examples out of the numerous experiences I have gathered and, as they say, experience is the best teacher. The experience I gained from writing one or two questionnaires made it easier for me to write the new ones given to me afterwards, so I was able to make each one better than the previous, and I thought of better and easier ways of creating templates.
Working with my handler Mr. Dare has been good. It was challenging at first, because most times he was not very explicit about what needed to be done in a project and about its other requirements, but that taught me to grow, to think outside the box on a project, and not to wait to be told exactly what to do. I now understand how the research industry works and the challenges it goes through, so if I ever find myself in a different research company, I will not be a novice in its procedures, thanks to the exposure I gained during my SIWES.
Most importantly, I also learnt how to relate with my colleagues and higher authorities in the office.

CHAPTER 3
3.0 CHALLENGES FACED DURING THE SIWES PROGRAMME
No matter how great an organisation is, it is bound to have its own shortcomings. Below are some of the challenges I faced during my internship with TNS-RMS.

Network fluctuation makes it difficult for communication to be effective: either e-mails are not delivered immediately or they end up not delivering at all, cutting short a lot of work.
The data processing department floor is too small for all the sub-departments placed on it, making the place congested and cramped even though the floor itself is big, coupled with the fact that the AC there works below standard, making the floor hot and stuffy, which is not good for communication or thinking.

Most times, due dates for projects to be delivered to the data processing department are not met, making the work more tedious for us in the department as we try to meet the overall due date given to the client.
I noticed something like preferential treatment for some departments over others. Some departments hold meetings that are meant for everyone before alerting the other departments about the meeting, which ought not to be so.

Contract staff are not given much attention; they are not granted any privileges.

3.0.1 RECOMMENDATIONS
Here are some recommendations I suggest should be adopted by the TNS-RMS data processing department in its future operations.
I recommend that the floor be increased in size and that the AC on the floor be replaced with a new one, to increase efficiency.

Due dates should be followed strictly.

Departments should be given equal treatment, even if individual staff cannot be.

Contract staff should be allowed a few privileges, like attending some of the important IQ trainings organised for permanent staff only, bearing in mind that some of them will sooner or later become permanent staff; this would aid their effectiveness in the field.

CHAPTER 4
4.0 OBSERVATIONS AND CONTRIBUTIONS
Working on a daily basis, you are prone to observe things about an organisation, things that seem normal and difficult to change, especially in the department where I was posted to work. There were four basic things I observed in the company where I did my SIWES. These include:

Boss and Staff Relationship: the relationship between the ordinary staff and those in higher authority is remarkable. They all act as if they are on the same level, they keep a free, chatty relationship with one another, their doors are open if you have problems, and they are all generous.
No Discrimination: by this I mean in terms of religion. Different kinds of people work in the company, from Christians to Muslims, Buddhists, etc.

Timeliness: every project given to a data analyst, or to any department whatsoever, is timed, i.e. a date of submission is attached to any project given to you by the manager.

Work-Friendly Environment: they create an environment where working is a joy. The resumption and closing times are good, from 8:30 am to 5:30 pm.
They organise get-togethers to ease you from thinking about work all day, there are sporting activities and tournaments, and, lest I forget, there is the daily lunch break, an hour of relaxation every day.

As I said in the previous chapter, I helped in creating over 17 templates, which I delivered on time, and did all the cleaning and briefing, which helped the company deliver clients' data to them on time, often even before the submission due dates.
I increased the flow of work through the department because of the extra helping hand I provided, so new jobs kept coming in before the ones at hand were finished; they were all delivered and submitted at the appropriate, stipulated time to the departments that needed them. I assisted some of my colleagues who found certain tasks difficult because they had not yet handled a project of the kind I had, so they asked for assistance, and in that way we all learnt from one another.

CHAPTER 5
5.0 CONCLUSION
I would say my internship at TNS-RMS has broadened my knowledge of programming, showing me the different ways it can be applied to make things easier, and I have also learnt how to relate with people professionally, among a host of other things. I have become a better programmer and a bigger thinker, not doing things the normal way but finding ways to make the normal way better and greater with every program written. I know that no knowledge is a waste, and later in the future I will be able to use whatever I have learnt here.
Most of what I have learnt is relevant to my course of study, with courses such as CSC312 (Java Programming), System Analysis and Design, and also SQL. I have also learnt a whole lot about research operations. Finally, SIWES has indeed afforded me the opportunity to acquire practical skills, thereby integrating the theoretical knowledge I acquired in the classroom, which until then seemed abstract to me, with the practical knowledge obtained during the training. Having working experience while still in school is a great privilege. Above all, the whole programme was a success.