How To and Technical Support
- 354 Topics
- 534 Replies
Does anyone know if Glint has the capability to change the default demographic breakdowns? Currently it gives us Team, Manager Hierarchy, and one more that we just aren’t that interested in viewing. I know you can manually change it once you are in the report, but we don’t want to put that hassle on all users.
We rolled out a pulse with a question where “I absolutely agree” is actually a bad thing and should be interpreted as a negative metric, but Glint still renders its (pretty high) score as blue in the reports. Is there a way to tell the software which end of the scale should be interpreted as positive and which as negative?
Dear Team, can anyone let me know how I can get access to previous VOE surveys (the ones before October 2022)? I would need them to make some comparisons and, hopefully, see improvements. I joined Honeywell in September 2022 and cannot manage to get access to this data. Thanks a lot. Stéphanie
I don’t understand how sentiment in the comment report is calculated. Is the sentiment based on the entire text of the comment for each open-ended question, or on each individual keyword or phrase found within a commenter’s response? For example, for a single question there are only 3 commenters, who each wrote a single sentence, yet the positive sentiment is 87% and the negative sentiment is 13%. This calculation makes it seem like one sentence from a single commenter can have a stronger influence on the sentiment values than the other commenters’ sentences, when in reality that commenter may simply be using two redundant positive words. Is there any way to categorize sentiment based on the full text of the comment?
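Glint does not publish its exact sentiment method, but percentages like 87%/13% from only three single-sentence comments are consistent with phrase-level aggregation, where every extracted keyword or phrase gets a vote, rather than comment-level aggregation, where each comment gets one vote. A hypothetical sketch contrasting the two (the phrase counts and labels below are invented for illustration, not Glint’s actual output):

```python
# Three invented comments, each reduced to a list of sentiment-tagged phrases.
# Commenter 3 is mostly positive but contributes the only negative phrases.
comments = [
    ["pos"] * 5,                # commenter 1: five positive phrases
    ["pos"] * 5,                # commenter 2: five positive phrases
    ["pos"] * 3 + ["neg"] * 2,  # commenter 3: mixed, leaning positive
]

def phrase_level(comments):
    """Every phrase is one vote: 13 of 15 phrases are positive."""
    phrases = [p for c in comments for p in c]
    pos = phrases.count("pos") / len(phrases)
    return round(pos * 100), round((1 - pos) * 100)

def comment_level(comments):
    """Every whole comment is one vote, based on its majority sentiment."""
    pos = sum(1 for c in comments if c.count("pos") > c.count("neg"))
    return round(pos / len(comments) * 100), round((1 - pos / len(comments)) * 100)

print(phrase_level(comments))   # (87, 13) — a few extra phrases shift the split
print(comment_level(comments))  # (100, 0) — all three commenters lean positive
```

Under phrase-level counting, a wordy commenter effectively carries more weight, which matches the behavior described in the question.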
Hi, I have a scenario like this: 100 people with an average score of, say, 53. If I slice these 100 people to separate out managers, I get 10 managers with an average score of 58 and 90 non-managers with an average score of 53. The simple average of 53 and 58 is 55.5 (rounded to 56), so there is a delta of 3 between the full population and the average of the subpopulations (53 vs 56). This is something I see when I analyze the recent engagement survey. Could someone explain why the average scores of the subpopulations do not match the average score of the full population?
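The gap comes from averaging the two subgroup scores as if the groups were the same size. Because the groups are unequal (10 managers vs 90 non-managers), the correct roll-up is a weighted average. A minimal sketch using the numbers from the question above:

```python
def weighted_average(groups):
    """groups: list of (headcount, average_score) pairs."""
    total_people = sum(n for n, _ in groups)
    total_points = sum(n * score for n, score in groups)
    return total_points / total_people

# 10 managers averaging 58, 90 non-managers averaging 53
groups = [(10, 58), (90, 53)]

unweighted = sum(score for _, score in groups) / len(groups)
weighted = weighted_average(groups)

print(unweighted)  # 55.5 — treats both groups as equally sized
print(weighted)    # 53.5 — matches the full-population score, within rounding
```

The small group of managers pulls the unweighted average up by much more than its 10% share of the population warrants; the weighted figure lands back near 53.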
On this response in the detail view for Skip Manager, is this a score of zero or a skipped question? I am not sure whether it means the question was skipped or the score was low. It shows two dashes when you hover over it, so I assume the question was skipped, but when I look at the results details it says zero.
When adding sections to the report, new sections are automatically added to the bottom. This isn't always the best place for that section, and the only way we've been able to re-order sections is to delete them and then re-add them in an order that makes the most sense. Is there a feature that allows you to move a section up or down?