
Scripting Question


TIME_TO_SEC appears to be incorrectly showing on reports

11/04/2014 2110 views

I have been working on a goal of having KACE Service Desk ticket reports display total ticket times from both NEW to OPEN and NEW to CLOSED, and I have run across a problem. When I use the TIME_TO_SEC function within a monthly report, 2 entries out of 100 show a TIME_TO_SEC where the CREATED seconds is a smaller number than the CLOSED. I then noticed our times are off when adding up the total times for all the tickets in a month and comparing the sum to the report. It gets really weird when I run a cumulative report over several months: one entry, which lies outside the criteria, may show 34 days+. I am trying to discover whether it is corrupt data or incorrect coding. Here is the code I am working with.

SELECT
  (CASE
     WHEN HD_TICKET.CREATED = HD_TICKET.TIME_OPENED THEN 'All Tickets'
     WHEN HD_TICKET.CREATED < DATE_ADD(IF(HD_TICKET.TIME_OPENED='0', IF(HD_TICKET.TIME_CLOSED='0', 0, HD_TICKET.TIME_CLOSED), HD_TICKET.TIME_OPENED), INTERVAL 1 HOUR) THEN 'All Tickets'
     WHEN HD_TICKET.CREATED < DATE_ADD(IF(HD_TICKET.TIME_OPENED='0', IF(HD_TICKET.TIME_CLOSED='0', 0, HD_TICKET.TIME_CLOSED), HD_TICKET.TIME_OPENED), INTERVAL 24 HOUR) THEN 'All Tickets'
     WHEN HD_TICKET.CREATED > DATE_ADD(IF(HD_TICKET.TIME_OPENED='0', IF(HD_TICKET.TIME_CLOSED='0', 0, HD_TICKET.TIME_CLOSED), HD_TICKET.TIME_OPENED), INTERVAL 24 HOUR) THEN 'All Tickets'
   END) AS CLOSE_GROUP,
  COUNT(HD_TICKET.ID) AS NUMBER_OF_TICKETS,
  CONCAT(
    (SUM(TIME_TO_SEC(TIMEDIFF(IF(HD_TICKET.TIME_OPENED='0', IF(HD_TICKET.TIME_CLOSED='0', 0, HD_TICKET.TIME_CLOSED), HD_TICKET.TIME_OPENED), HD_TICKET.CREATED))) / COUNT(HD_TICKET.ID) DIV 86400), 'd ',
    TIME_FORMAT(SEC_TO_TIME(SUM(TIME_TO_SEC(TIMEDIFF(IF(HD_TICKET.TIME_OPENED='0', IF(HD_TICKET.TIME_CLOSED='0', 0, HD_TICKET.TIME_CLOSED), HD_TICKET.TIME_OPENED), HD_TICKET.CREATED))) / COUNT(HD_TICKET.ID) MOD 86400), '%kh %im %ss')
  ) AS AVG_TIME_TO_OPEN
FROM HD_TICKET, HD_STATUS
WHERE HD_TICKET.HD_STATUS_ID = HD_STATUS.ID
  AND CREATED >= '2014-10-01' /* change the start date here */
  AND CREATED <= '2014-10-31' /* change the end date here */
  AND HD_TICKET.HD_QUEUE_ID IN (1) /* add queue numbers here */
GROUP BY CLOSE_GROUP
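The CONCAT(... DIV 86400, 'd ', TIME_FORMAT(SEC_TO_TIME(... MOD 86400), ...)) pair in the query splits the average duration into whole days plus a sub-day remainder. The same split can be sketched in plain arithmetic; this is just an illustration of the div/mod logic, not the KACE report engine, and the function name is made up for the example:

```python
def format_avg(total_seconds: int, ticket_count: int) -> str:
    """Split an average duration into days + h/m/s, mirroring the
    `DIV 86400` / `MOD 86400` pair in the report query."""
    avg = total_seconds // ticket_count       # average seconds per ticket
    days, rest = divmod(avg, 86400)           # whole days, leftover seconds
    hours, rest = divmod(rest, 3600)
    minutes, seconds = divmod(rest, 60)
    return f"{days}d {hours}h {minutes}m {seconds}s"

print(format_avg(1_000_000, 3))  # 333333 s per ticket -> "3d 20h 35m 33s"
```

Because the day component is separated out with DIV before SEC_TO_TIME is ever called, only the sub-day remainder (always under 86400 seconds) passes through SEC_TO_TIME in the query above.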


All Answers

This is actually a SEC_TO_TIME issue. It will only convert up to 838:59:59
Answered 11/04/2014 by: h2opolo25
Red Belt
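The 838:59:59 cap mentioned in this answer is the upper bound of MySQL's TIME type, and SEC_TO_TIME clamps any larger input to it. The arithmetic below (plain Python, not MySQL) shows why this would surface as a roughly 34-day ceiling, which matches the "34 days+" outlier described in the question:

```python
# MySQL's TIME type covers -838:59:59 .. 838:59:59, so SEC_TO_TIME()
# clamps any input beyond this many seconds.
CAP_SECONDS = 838 * 3600 + 59 * 60 + 59
print(CAP_SECONDS)          # 3020399
print(CAP_SECONDS / 86400)  # ~34.96 days: the apparent ceiling on durations
```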

  • h2opolo25, this is just one instance that I have noticed. It also happens in the following code, which does not use SEC_TO_TIME at all:

    SELECT HD_TICKET.ID, O.FULL_NAME AS OWNER_NAME, HD_TICKET.CREATED, HD_TICKET.TIME_OPENED, HD_TICKET.TIME_CLOSED,
      TIME_TO_SEC(TIMEDIFF(IF(HD_TICKET.TIME_OPENED='0', IF(HD_TICKET.TIME_CLOSED='0', 0, HD_TICKET.TIME_CLOSED), HD_TICKET.TIME_OPENED), HD_TICKET.CREATED)) / 60 AS Time_Diff_Min
    FROM HD_TICKET
    LEFT JOIN USER O ON (O.ID = HD_TICKET.OWNER_ID)
    WHERE (HD_TICKET.HD_QUEUE_ID = 1)
      AND HD_TICKET.CREATED >= '2014-10-01'
      AND HD_TICKET.CREATED <= '2014-10-31'

    That leads me to believe it is TIME_TO_SEC that is giving me the issue.

    Thanks
    • Looking at this second code you posted... I think the problem is that you put single quotes around the 0s for TIME_OPENED and TIME_CLOSED. If you really want to compare against a quoted value, you need to write '0000-00-00 00:00:00'; otherwise just write 0 without quotes.
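The fallback chain both queries rely on, IF(TIME_OPENED=0, IF(TIME_CLOSED=0, 0, TIME_CLOSED), TIME_OPENED), picks TIME_OPENED when set, else TIME_CLOSED when set, else a zero sentinel, then diffs against CREATED. A hedged Python sketch of that intended logic (not KACE's implementation; the zero date '0000-00-00 00:00:00' is modeled as None, and the function names are invented for illustration):

```python
from datetime import datetime

ZERO = None  # stand-in for MySQL's zero date '0000-00-00 00:00:00'

def effective_end(time_opened, time_closed):
    """Mirror IF(TIME_OPENED=0, IF(TIME_CLOSED=0, 0, TIME_CLOSED), TIME_OPENED)."""
    if time_opened is ZERO:
        return time_closed  # may itself be ZERO if the ticket never closed
    return time_opened

def minutes_to_open(created, time_opened, time_closed):
    """TIME_TO_SEC(TIMEDIFF(end, CREATED)) / 60, or None when no end time exists."""
    end = effective_end(time_opened, time_closed)
    if end is ZERO:
        return None
    return (end - created).total_seconds() / 60

created = datetime(2014, 10, 6, 9, 0, 0)
opened = datetime(2014, 10, 6, 9, 45, 0)
print(minutes_to_open(created, opened, ZERO))  # 45.0
```

With real datetime values the sentinel test is unambiguous; in the SQL version, comparing the column against '0' instead of 0 or the full zero-date literal is exactly where the comment above suggests the queries go wrong.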
