Hi Nuria,

Thanks for the reply. Am I correct in assuming you mean the channel irc://irc.freenode.net/wikidata, or is there a specific analytics channel?

Greetings,

Adrian


On 09/10/2017 03:29 AM, Nuria Ruiz wrote:
IRC is probably better suited for this type of question. You have an issue with conversion of types:

Caused by: java.lang.RuntimeException: Hive internal error: conversion of string to array<string>not supported yet.
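
One way to start narrowing this down (just a sketch; which column is affected is an assumption you would need to verify against the real table) is to check which columns the table declares with a complex type, since the error means some file in those partitions stores as a plain string a value the table schema declares as array<string>:

  hive -e "describe wmf.wdqs_extract;"

Whichever column (or nested struct field) comes back as array<string> is the likely culprit.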


Selecting any of the hours that do not work produces the same error. You can inspect the table data by looking at the files in /wmf/data/wmf/wdqs_extract/year=2017/month=8/day=23/hour=9/ and try to find the records with faulty data, to see whether you can exclude them from your query.
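
For example (a sketch only; adjust the partitions to whatever you want to compare), listing a working hour next to a failing one can already show whether the files differ:

  hdfs dfs -ls /wmf/data/wmf/wdqs_extract/year=2017/month=8/day=23/hour=8/
  hdfs dfs -ls /wmf/data/wmf/wdqs_extract/year=2017/month=8/day=23/hour=9/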

You can get the data with hdfs dfs (e.g. hdfs dfs -cat /wmf/data/wmf/wdqs_extract/year=2017/month=8/day=23/hour=8/) and look at it with the parquet-tools command line; see the sketch below. As I mentioned earlier, pinging us on IRC would probably be best.
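
Concretely, something like the following should work (a sketch only; the file name 000000_0 is just an illustration, use whatever hdfs dfs -ls shows in the partition):

  # copy one parquet file from the failing partition to the local machine
  hdfs dfs -get /wmf/data/wmf/wdqs_extract/year=2017/month=8/day=23/hour=9/000000_0 .
  # print its schema and the first few records
  parquet-tools schema 000000_0
  parquet-tools head -n 5 000000_0

If the schema of a file from a failing hour differs from that of a working hour (for example, a field written as a plain string where the table expects array<string>), those are the files to exclude.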

On Sat, Sep 9, 2017 at 7:42 AM, Adrian Bielefeldt <Adrian.Bielefeldt@mailbox.tu-dresden.de> wrote:
Hello everyone,

I'm having trouble with the entries in wmf.wdqs_extract on 23.08.2017,
specifically the hours 9-17. The call

hive -e "insert overwrite local directory 'temp' row format delimited
fields terminated by '\t' select uri_query, uri_path, user_agent, ts,
agent_type, hour, http_status from wmf.wdqs_extract where
uri_query<>\"\" and year='2017' and month='8' and day='23' and hour='8'"

works fine, as do hours 0-8 and 18-23, but if I change hour to anything between 9 and 17 it fails with the following message:

Error: java.lang.RuntimeException: Error in configuring object
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:449)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:421)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1796)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
        ... 9 more
Caused by: java.lang.RuntimeException: Error in configuring object
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
        at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:38)
        ... 14 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
        ... 17 more
Caused by: java.lang.RuntimeException: Map operator initialization failed
        at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.configure(ExecMapper.java:147)
        ... 22 more
Caused by: java.lang.RuntimeException: Hive internal error: conversion of string to array<string>not supported yet.
        at org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorConverters$ListConverter.<init>(ObjectInspectorConverters.java:313)
        at org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorConverters.getConverter(ObjectInspectorConverters.java:158)
        at org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorConverters$StructConverter.<init>(ObjectInspectorConverters.java:374)
        at org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorConverters.getConverter(ObjectInspectorConverters.java:155)
        at org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorConverters$StructConverter.<init>(ObjectInspectorConverters.java:374)
        at org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorConverters.getConverter(ObjectInspectorConverters.java:155)
        at org.apache.hadoop.hive.ql.exec.MapOperator.initObjectInspector(MapOperator.java:199)
        at org.apache.hadoop.hive.ql.exec.MapOperator.setChildren(MapOperator.java:355)
        at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.configure(ExecMapper.java:116)
        ... 22 more

Can anybody help me out with this?

Greetings,

Adrian


_______________________________________________
Analytics mailing list
Analytics@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/analytics


