
Showing posts from May, 2021

How to Integrate Google Meet in Labvanced.com

We have now developed a solution that enables researchers on Labvanced.com to integrate Google Meet within their studies! You can even combine that with other interesting features, like eye tracking, multi-user studies, or collaborative tasks within iFrames, and much more. There are many interesting research options to explore, but how does it work? Let's assume the following example: the goal is to create a multi-user study where 2-3 people jointly engage in a task on Labvanced while communicating via audio/video chat. Each participant group should have its own call/video chat. Here is how you do it:
1) Create a multi-user study on Labvanced.com.
2) Create a series of Google Calendar events with a Meet link/URL (one for each group of people; the time/date doesn't matter).
3) Put those URLs into a string array and read them out such that people in the same multi-user group see the same link.
4) Render the link as a URL so that participants can just click on the...
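Steps 3 and 4 above can be sketched in plain JavaScript. This is only an illustration of the idea, not Labvanced's internal API: the `meetLinks` array, `linkForGroup`, and `renderLink` names are hypothetical, and in the study builder you would store the URLs in an array variable and index it with the group number assigned by the platform.

```javascript
// Step 3 sketch: map each multi-user group to one Meet link.
const meetLinks = [
  "https://meet.google.com/aaa-bbbb-ccc",
  "https://meet.google.com/ddd-eeee-fff",
  "https://meet.google.com/ggg-hhhh-iii",
];

function linkForGroup(groupIndex) {
  // Every member of the same group reads the same array entry.
  return meetLinks[groupIndex % meetLinks.length];
}

// Step 4 sketch: render the link as clickable HTML in a display frame.
function renderLink(url) {
  return `<a href="${url}" target="_blank">Join your group's video call</a>`;
}
```

Because the group index drives the lookup, all participants who were matched into the same multi-user session see the same call link.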

Now with Labvanced.com you can create your study in any language that you desire

To accommodate researchers in any country, as well as those conducting studies across multiple countries or with people from different cultures and backgrounds, Labvanced now gives you the option to create your study in the language you desire. This means not only can you choose the language of your study, but all the static strings (system-generated messages) can also be in the language you have chosen. The static strings are available in 6 languages (English, French, Spanish, German, Portuguese, Chinese). These are the languages you can choose with a button click, but if you want a different language, you can simply add it yourself. On top of that, Labvanced provides a total solution for your psychological or UX study, which means:
1- Cutting-edge eye-tracking software that is better than anything else available on the market today.
2- A great and easy-to-use interface, even for complex studies.
3- Over 25 stimuli types and more than 200 templates f...
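The idea of per-language static strings with a self-added language can be pictured as a simple lookup table. The object shape, key names, and the Dutch entry below are illustrative assumptions, not the platform's internal format:

```javascript
// Hypothetical sketch of a static-string table with a custom language added.
const staticStrings = {
  en: { nextButton: "Next", thanks: "Thank you for participating!" },
  de: { nextButton: "Weiter", thanks: "Vielen Dank für Ihre Teilnahme!" },
  // A language you add yourself, e.g. Dutch:
  nl: { nextButton: "Volgende", thanks: "Bedankt voor uw deelname!" },
};

function t(lang, key) {
  // Fall back to English if a translation is missing.
  const table = staticStrings[lang] || {};
  return table[key] ?? staticStrings.en[key];
}
```

The English fallback keeps the study usable even when a newly added language is only partially translated.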

Video Recording of the Participant at Labvanced.com

Recently a new feature was added to the Labvanced research platform for psychological and user-experience studies: video recording. Video recording can be done using a regular computer webcam or mobile camera, and the recording session will be stored securely on our server. There are of course plenty of use cases for recording/obtaining video data. To provide some ideas, you can use the video recording data to:
a) Analyze the emotional/subjective response towards certain stimuli or tasks. For instance, the video could be analyzed post hoc with machine learning techniques to detect the face and consequently categorize the emotional response of the participants.
b) Serve as a control measure to see whether participants were paying attention to the task, and also whether "environmental conditions" during the recording were appropriate (not too bright or dark, not too loud, alone in a room, etc.).
So if you are a researcher in psyc...

A new state-of-the-art feature was added recently to Labvanced.com, which is data frames

A data frame is a table or two-dimensional, array-like structure in which you can combine multiple sets of stimuli of different types, such as images, text, videos, and more. That way you can put all the stimuli in your study into one place and easily read out a different stimulus (or set of stimuli) for each trial or frame. The data stored in a data frame can be numeric, text, Boolean, image, audio, or video data. This allows researchers to:
1- Create very complex studies smoothly.
2- Use our visual study builder more easily, saving a great deal of time.
3- Import stimuli or data directly from a 2D CSV file.
4- Create complex randomization between subjects with ease.
5- Associate stimuli (text, sound, images, videos) with each other more easily.
To test out these features yourself, you can simply use this template study from our open-source experiment library:...
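The data-frame idea - one row of stimuli per trial, with per-subject randomization of row order - can be sketched as follows. The column names and the shuffle helper are illustrative; in Labvanced you would import such a table from a CSV file and access it through the visual study builder rather than writing this code yourself:

```javascript
// A minimal stimulus table: each row is read out for one trial.
const stimulusFrame = [
  { image: "face01.png", label: "happy",   correctKey: "F" },
  { image: "face02.png", label: "sad",     correctKey: "J" },
  { image: "face03.png", label: "neutral", correctKey: "F" },
];

// Per-subject randomization: shuffle a copy of the row order
// (Fisher-Yates), leaving the original table intact.
function shuffledRows(rows, rng = Math.random) {
  const copy = rows.slice();
  for (let i = copy.length - 1; i > 0; i--) {
    const j = Math.floor(rng() * (i + 1));
    [copy[i], copy[j]] = [copy[j], copy[i]];
  }
  return copy;
}

// Read out the stimulus row for a given trial number.
function stimulusForTrial(rows, trial) {
  return rows[trial % rows.length];
}
```

Shuffling a copy per subject is what gives each participant a different trial order while every subject still sees every row exactly once.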

Labvanced.com compared to Millisecond and Qualtrics

Let’s start with a comparison to “Inquisit Web”. The most fundamental difference here, which is also the main advantage of Labvanced, is that Labvanced runs natively in any browser (Chrome, Firefox, Edge, Safari) and on any device (PC, laptop, mobile, tablet) without the participant needing to install a plugin. Most internet users and potential participants will not install the Inquisit plugin, so with Labvanced you have access to a much larger audience/participant base for your experiment. The second main difference is that Labvanced offers a graphical UI editor to create your experiment while still supporting the most advanced studies, with features such as automatic video/audio presentation, audio (and soon also video) recording, longitudinal studies, webcam-based eye tracking, multi-participant experiments, and much more. A third major advantage is that the Labvanced source code is open source (https://github.com/Labvanced/), and we ...

Using Labvanced.com to build a psychological game or multi-user study

The new Labvanced server provides an excellent platform for psychological multi-participant and joint-action studies in which multiple participants interact in real time. A multi-user study allows multiple users on different computers to participate in the same study simultaneously and share state between them. Participants in the study are connected through a real-time network connection. Synchronization is achieved via the exchange of variables, and a server-side handshake procedure prevents race conditions and ensures a common ground truth of states. With the Labvanced visual study builder, you can create a multi-user study to explore cooperative behavior, joint decision-making/joint action, socio-economic games, and much more, quickly and smoothly, without having to do the difficult coding or calculations yourself. Participants in the study will be able to communicate and interact with each other and with the study’s stimuli through our server. With the websocket...
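One way to picture how a server-side handshake can prevent race conditions on shared variables is version checking: each shared variable carries a version number, and the server only accepts an update that was based on the version the client last saw. This is an illustrative sketch of that general idea, not Labvanced's actual protocol:

```javascript
// Illustrative server-side shared state with versioned writes.
function makeSharedState() {
  const state = {}; // name -> { value, version }
  return {
    read(name) {
      return state[name] ?? { value: undefined, version: 0 };
    },
    // Accepts the write only if the client saw the current version,
    // so two simultaneous writes cannot silently overwrite each other.
    write(name, value, basedOnVersion) {
      const current = state[name] ?? { value: undefined, version: 0 };
      if (basedOnVersion !== current.version) return false; // stale write
      state[name] = { value, version: current.version + 1 };
      return true;
    },
  };
}
```

If two participants write "at the same time", the first write wins and bumps the version; the second write is rejected as stale, and that client must re-read the variable before retrying - a common ground truth is preserved.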

How to connect Labvanced.com with external devices (EEGs, Eyetracking..)

Due to the increasing demand for feasibility checks, with researchers asking us whether their study can be realized on Labvanced, we decided to create a new format in which we explain how to implement a certain type of study, mostly focusing on advanced features. If any of you has a particular request on how to implement a certain type of study, please reach out to us and we will consider it for one of the next posts. In this first episode, we would like to explain how to connect Labvanced with external devices (such as EEG systems, eye trackers, force plates, or other devices recording neuro-physiological data) - in short, how you can run real lab experiments using Labvanced. Most importantly, the technology used here is called "websockets". Websockets offer bidirectional data/network communication. You can enable websockets in the Labvanced study settings and then enter an IP address and a port. In ca...
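As a rough sketch of what travels over such a connection, an experiment might send a JSON trigger message (e.g. an EEG event marker) to a machine at the IP and port configured in the study settings. The payload fields below are hypothetical assumptions for illustration; consult the Labvanced documentation for the actual message format:

```javascript
// Build a trigger payload for an external recording device.
// Field names (event, trial, timestamp) are assumptions, not
// Labvanced's defined schema.
function makeTrigger(eventName, trialNr) {
  return JSON.stringify({
    event: eventName,
    trial: trialNr,
    timestamp: Date.now(),
  });
}

// In the browser, the standard WebSocket API would carry the message
// (IP address and port taken from the study settings):
//   const ws = new WebSocket("ws://192.168.0.10:8080");
//   ws.onopen = () => ws.send(makeTrigger("stimulus_onset", 1));
```

Because websockets are bidirectional, the same connection could also carry data from the device back into the running study.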

Why choose Labvanced.com for your psychological research?

1- Labvanced offers a graphical task builder and highly advanced experiment features. Temporal accuracy/RT measures are millisecond-precise and spatial accuracy is pixel-perfect.
2- We are open source and offer over 200 studies/templates in our open-access library.
3- We offer any kind of stimulus presentation and recording option, including audio and video.
4- We are fully mobile/touch compatible for smartphones and tablets.
5- We offer our own superior deep-learning pipeline for webcam-based eye tracking.
6- You can create/run real-time multi-participant experiments (e.g. economic games).
7- You can create ANY kind of experimental logic (If/Else, loops, callbacks, your own JavaScript).
8- Using our WebSocket API, you can connect Labvanced with other devices in real time.
9- You can download and run Labvanced locally in your lab without internet access.

Labvanced.com: The Best Choice for Visual Perception Research

Labvanced.com’s easy-to-use visual study builder includes all the features a researcher studying visual perception needs to conduct any kind of study, even the most complex one:
1- The best eye tracking available on the market today.
2- Head tracking.
3- Video recording of the participant’s screen (screen capture).
4- Audio and video recording of the subject.
5- All the stimulus types a visual perception researcher can think of.
6- Multi-user studies and games.
And many more features. Labvanced.com already has more than 50 published studies about visual perception in its library. Here are a few of them, just as examples:
1- Fairness Decision-Making: https://www.labvanced.com/page/library/6953
2- Eye Tracking Aesthetic Study: https://www.labvanced.com/page/library/16548
3- Impact of Coloring Effect on Decisions: https://www.labvanced.com/page/library/15592
4- Are all eyes the same? https://www.labvanced.com/page/library/15686
La...

Labvanced.com: The Best Choice for Auditory Perception Research

Trusted and tested by many of the big universities and labs around the world, Labvanced.com is the best choice for researchers who want to study auditory perception. Labvanced.com offers:
1- An easy-to-use visual study builder that helps you build even the most complex studies in no time.
2- Audio and video recording of the subject.
3- All the stimulus types an auditory perception researcher can think of.
4- The best eye tracking available on the market today.
5- Multi-user studies and games.
And many more features. Labvanced.com already has more than 50 published studies on auditory perception in its library. Here are a few of them, just as examples:
1- Phrases and Problem-Solving in Music: Games, Learning and Memory, Auditory Perception. https://www.labvanced.com/page/library/5568
2- Perception and Expression of Musical Emotions: Auditory Perception, Visual Perception, Emotions. https://www.labvanced.com/page/library/5521
3- Vowel Perception: Auditor...

Now Labvanced.com Allows Recording of Screenshots (Screen Capture of The Participant’s Screen)

This new feature allows the researcher to record what the participant sees and is looking at during the study, combined with our new eye-tracking algorithm (which could be considered the best webcam eye tracking available on the market today). Labvanced.com’s platform is now the best choice for any researcher who wants to conduct a psychological or UX study that includes eye tracking. With this new feature, you can overlay the recorded screen video with the respondent’s gaze patterns, allowing deeper examination of variables and constructs beyond traditional behavioral measurement. Labvanced.com is also a fully integrated platform for hosting various psychological (e.g., Posner, SART, inattentional blindness, etc.) or UX studies, regardless of the complexity of the research paradigm, because Labvanced.com has:
1- All the features a psychology researcher needs for empirical investigation.
2- A visual study builder that is easy to use and will let you build the most complex stu...
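Overlaying gaze patterns on a screen recording boils down to mapping each gaze sample onto the coordinates of the recorded video frame. A minimal sketch, assuming gaze samples arrive normalized to the 0..1 range in both axes (a common convention, not necessarily the format Labvanced exports):

```javascript
// Convert a normalized gaze sample into pixel coordinates of the
// recorded frame, e.g. to draw a gaze dot on top of the video.
function gazeToPixels(sample, frameWidth, frameHeight) {
  return {
    x: Math.round(sample.x * frameWidth),
    y: Math.round(sample.y * frameHeight),
  };
}
```

With per-frame timestamps on both streams, each video frame can then be annotated with the gaze positions recorded at that moment.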

Head Tracking: A New Feature from Labvanced.com

A new state-of-the-art feature was added recently to the Labvanced.com platform: head tracking via a regular computer webcam or phone camera. We believe this feature will be a great help to researchers in the psychology field. Head motion is physically complex and carries rich information. Head gestures (nodding, shaking, tilting, tossing, dipping, thrusting, dropping, etc.) are also a way of conveying emotions. Importantly, compared to eye tracking, head tracking doesn't need any calibration and just works immediately. Head tracking can be used to:
1- Show whether the participant was facing the screen or had their head tilted left or right during the experiment.
2- Show, during any experiment, whether the participant was paying attention and directing their face towards the screen or somewhere else.
3- Get metrics about where the face is relative to the screen, e.g. centered or to the left or right, and also a relative distance to the screen.
W...
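The position metrics described above can be pictured from a face bounding box in camera coordinates. The box format, thresholds, and distance heuristic below are illustrative assumptions for this sketch, not Labvanced internals:

```javascript
// Classify where the face is relative to the camera frame from a
// bounding box { x, width } in pixels. Thresholds are assumptions.
function facePosition(box, frameWidth) {
  const centerX = box.x + box.width / 2;
  const offset = centerX / frameWidth - 0.5; // range -0.5 .. 0.5
  if (offset < -0.15) return "left";
  if (offset > 0.15) return "right";
  return "centered";
}

// A wider face box roughly means the face is closer to the camera,
// giving a relative (unitless) distance measure.
function relativeDistance(box, frameWidth) {
  return frameWidth / box.width; // smaller value = closer
}
```

Logged per frame, these two values are enough to flag intervals where the participant looked away or leaned far from the screen.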

New Cutting Edge Eye Tracking Technology

To evaluate Labvanced.com’s new cutting-edge eye-tracking algorithm, our research team ran several tests to determine its accuracy. The data shows that our new eye-tracking algorithm has a spatial accuracy of 1.4° horizontally and 1.3° vertically. Here is a link to the spatial-accuracy test study in the Labvanced library: https://www.labvanced.com/page/library/12990. We believe that our eye-tracking algorithm is better than any other webcam-based eye-tracking technology and in fact really close to lab-based accuracy. If you want to check it out yourself, we have several demo studies in our library:
https://www.labvanced.com/page/library/11099 - A performance test where the predicted gaze position is displayed on the screen, so you can test how good the prediction really is.
https://www.labvanced.com/page/library/11327 - A little game where you have to fixate objects to "destroy" them before they reach the bottom of the screen.
https://www.labvanc...
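To get a feel for what 1.4° of visual angle means on screen, the standard conversion from degrees to on-screen size can be applied. The viewing distance and pixel density below are assumptions for this sketch, not measured values from the accuracy study:

```javascript
// Convert degrees of visual angle to pixels, given viewing distance
// and screen pixel density (standard visual-angle geometry).
function degreesToPixels(degrees, viewingDistanceCm, pixelsPerCm) {
  const sizeCm =
    2 * viewingDistanceCm * Math.tan(((degrees / 2) * Math.PI) / 180);
  return sizeCm * pixelsPerCm;
}

// Example: 1.4 deg at an assumed 60 cm viewing distance on a screen
// with ~38 px/cm (roughly a 13" full-HD laptop panel).
const errPx = degreesToPixels(1.4, 60, 38);
```

Under these assumptions the 1.4° horizontal accuracy corresponds to an error radius of roughly 55 pixels, which is useful when deciding how large and how far apart on-screen targets should be.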