[Ffmpeg-devel-irc] ffmpeg.log.20160630

burek burek021 at gmail.com
Fri Jul 1 02:05:01 CEST 2016


[00:31:16 CEST] <UnDeRsOuL> Hello, I'm recording the screen on my macbook, but it says that yuv420p is not supported by the device. How can I list the supported modes? And is it ever going to be supported?
[00:32:21 CEST] <Mavrik> Can you pastebin the actual error?
[00:32:34 CEST] <Mavrik> Including your command-line and everything ffmpeg displays?
[00:32:47 CEST] <UnDeRsOuL> ok, wait
[00:37:45 CEST] <UnDeRsOuL> Mavrik: http://pastebin.com/BBEmMCU5
[00:38:34 CEST] <furq> the list of supported modes is shown right beneath that error
[00:38:38 CEST] <Mavrik> Well, it's telling you that your INPUT device (avfoundation) doesn't support that pixel format.
[00:38:40 CEST] <furq> or that warning, rather
[00:38:47 CEST] <Mavrik> So you should really ask Apple :P
[00:38:59 CEST] <Mavrik> But that's usually not a problem, what kind of problem are you really having?
[00:39:04 CEST] <furq> you probably just want to specify -pix_fmt yuv420p as an output option
[00:39:20 CEST] <furq> i can't imagine any other reason why it'd be an issue
[00:39:29 CEST] <UnDeRsOuL> furq, yes that works, I thought it was a ffmpeg problem
[00:39:39 CEST] <Mavrik> It's not a problem at all.
[00:39:42 CEST] <Mavrik> Your input is in one pixel format.
[00:39:45 CEST] <Mavrik> Your video is in another.
[00:39:58 CEST] <Mavrik> It's how things are :)
[00:40:08 CEST] <UnDeRsOuL> It won't affect the quality of the output?
[00:40:14 CEST] <furq> no
[00:40:18 CEST] <Mavrik> It will.
[00:40:26 CEST] <furq> uyvy422 is higher quality than yuv420p
[00:41:02 CEST] <Mavrik> You're grabbing RGB screen into a video format that only does yuv420p on most devices.
[00:41:09 CEST] <Mavrik> You're losing color accuracy in that conversion somewhere.
[00:41:18 CEST] <Mavrik> But there's nothing you can do about it if you want to keep your video compatible.
[00:41:48 CEST] <Mavrik> It's a limitation of most H.264 video players.
[00:42:01 CEST] <furq> it might make more sense to capture in nv12 since that's probably less work for swscale
[00:42:08 CEST] <furq> it's not going to make a big difference though
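The trade-off furq and Mavrik describe comes down to chroma sampling. A quick sketch of uncompressed frame sizes for the formats mentioned, assuming even dimensions and no row padding:

```python
def frame_bytes(pix_fmt, w, h):
    """Uncompressed size in bytes of one w x h frame (even w/h, no padding)."""
    sizes = {
        "rgb24":   w * h * 3,        # full-resolution R, G, B per pixel
        "uyvy422": w * h * 2,        # 4:2:2 - chroma halved horizontally
        "yuv420p": w * h * 3 // 2,   # 4:2:0 - chroma halved both ways
        "nv12":    w * h * 3 // 2,   # same 4:2:0 sampling, different layout
    }
    return sizes[pix_fmt]

# At 1080p, rgb24 carries twice the data of yuv420p; uyvy422 sits in between,
# which is why uyvy422 -> yuv420p is a lossy step.
for fmt in ("rgb24", "uyvy422", "yuv420p"):
    print(fmt, frame_bytes(fmt, 1920, 1080))
```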
[00:43:14 CEST] <UnDeRsOuL> ok thx guys =)
[00:43:41 CEST] <furq> you will definitely want to specify the output pixel format though
[00:43:57 CEST] <furq> if compatibility is an issue then use yuv420p
[00:46:35 CEST] <UnDeRsOuL> Does ffmpeg have mp4 version 2 as a container?
[00:57:04 CEST] <furq> i'm pretty sure it only has version 2
[00:57:10 CEST] <roxlu> hi! I'm extracting a keyframe from a flv stream with something like: ./extract_keyframe | ffmpeg -i - ....., but sometimes it takes too long and I was wondering if I can timeout ffmpeg somehow?
[01:14:40 CEST] <vandemar> roxlu: what about writing something that times out, and pipes stdin to stdout, and then doing ./extract_keyframe | timeout-pipe-prog | ffmpeg -i - ... ?
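vandemar's timeout-pipe idea can be sketched in a few lines of Python. Where GNU coreutils is available, `./extract_keyframe | timeout 10 ffmpeg -i - ...` is the simpler route; `pump` below is a hypothetical helper, not an existing tool:

```python
import select
import sys
import time

def pump(deadline_s, src=None, dst=None, chunk_size=65536):
    """Copy src to dst until EOF, giving up once deadline_s seconds pass.

    Returns True on clean EOF, False if the deadline was hit.
    Intended use: ./extract_keyframe | timeout_pipe.py | ffmpeg -i - ...
    """
    src = src or sys.stdin.buffer
    dst = dst or sys.stdout.buffer
    end = time.monotonic() + deadline_s
    while True:
        remaining = end - time.monotonic()
        if remaining <= 0:
            return False                      # deadline hit
        ready, _, _ = select.select([src], [], [], remaining)
        if not ready:
            return False                      # deadline hit while waiting
        data = src.read1(chunk_size)
        if not data:
            return True                       # upstream closed cleanly
        dst.write(data)
        dst.flush()
```

When `pump` returns False, closing stdout makes ffmpeg see EOF on its input and exit.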
[01:57:13 CEST] <bp0> hello. I see that hdcd filter was added in 3.1, but there is no documentation. how is it used?
[02:01:19 CEST] <furq> -i 16bit.wav -af hdcd 20bit.wav
[02:03:11 CEST] <bp0> hmm, I tried that but the output was still 16bit
[02:06:03 CEST] <bp0> using HDCD16.flac from https://trac.ffmpeg.org/ticket/4441#no1 ...
[02:06:27 CEST] <bp0> I tried ./ffmpeg -i HDCD16.flac -af hdcd -o OUT.wav
[02:06:56 CEST] <bp0> and OUT.wav was nearly identical to the original, but about 30 bytes longer
[02:07:23 CEST] <bp0> *identical to the original after flac -d
[02:13:27 CEST] <bp0> ffprobe OUT.wav ---  Stream #0:0: Audio: pcm_s16le ([1][0][0][0] / 0x0001), 44100 Hz, 2 channels, s16, 1411 kb/s
[02:34:00 CEST] <EngineerMode> It looks like Chrome uses FFmpeg in some capacity. Does it use FFmpeg for its primary video decoding?
[04:02:06 CEST] <Prelude2004c> hey guys.. question anyone know this error : Unsupported data layout: yuvj420p ..
[04:02:11 CEST] <Prelude2004c> do i have to set something on the input to get around this ?
[04:02:14 CEST] <Prelude2004c> vdpau is complaining
[04:02:26 CEST] <furq> pastebin the command and output
[04:08:50 CEST] <Prelude2004c> figured it out... thanx to turik ... can't use vdpau
[04:53:02 CEST] <strongcoffee> anyone know of a batch script progress bar for ffmpeg?
[04:53:34 CEST] <strongcoffee> not the default one that is, something simpler
[04:54:02 CEST] <strongcoffee> I've seen progress bars for other things but I'm not sure how it would be integrated with ffmpeg
[05:46:07 CEST] <HokarPokar> I am trying to grab a window using "ffmpeg -f gdigrab -framerate 30 -i title="Quake 3: Arena" -s 1280x720 -c:v libx264 -b:v 4M -pix_fmt yuv420p -preset ultrafast http://locahost:8084/feed1.ffm"
[05:46:25 CEST] <HokarPokar> But I get only white empty frames.
[05:47:52 CEST] <HokarPokar> But I tried grabbing the window of calculator application. It works. Doesn't work with the game. Any help ?
[05:50:22 CEST] <HokarPokar> Any help, please ?
[05:53:07 CEST] <furq> i doubt gdigrab will work with opengl windows
[05:53:09 CEST] <HokarPokar> Anyone online, ?
[05:55:34 CEST] <furq> doesn't ioquake3 support exporting demos to avi
[05:55:39 CEST] <furq> that's probably the easiest way to do it
[05:57:20 CEST] <HokarPokar> I tried outputting it to an mp4 file also. But still white frames.
[05:57:33 CEST] <furq> oh wait are you trying to live stream this
[05:57:35 CEST] <HokarPokar> Is this a problem with gdigrab itself ?
[05:57:45 CEST] <kepstin> no, with your graphics card driver
[05:57:54 CEST] <furq> like i said, i doubt it works with opengl or dshow
[05:57:55 CEST] <kepstin> it's just not returning the picture with that api
[05:58:00 CEST] <furq> er
[05:58:06 CEST] <furq> i doubt gdigrab or dshow will work with opengl
[05:58:24 CEST] <furq> i know x11grab doesn't work with opengl on linux
[05:58:44 CEST] <HokarPokar> But I am able to grab the window of the calculator application.
[05:58:45 CEST] <kepstin> hmm? last I checked, x11grab can do opengl on linux
[05:58:55 CEST] <furq> the calculator isn't using opengl
[05:59:00 CEST] <furq> unless microsoft have actually gone insane
[05:59:13 CEST] <furq> i assume OBS supports this
[05:59:23 CEST] <furq> and you shouldn't be using ffserver anyway because it's unsupported garbage
[06:00:55 CEST] <HokarPokar> I am quite naive on all this. So ffmpeg tries to get the frames from opengl but opengl on windows doesn't have an API to return those frames ?
[06:00:58 CEST] <furq> kepstin: doesn't opengl normally bypass x11 entirely
[06:01:19 CEST] <kepstin> furq: not if you're using a desktop compositor
[06:01:37 CEST] <kepstin> I guess people who are using a classic wm might have issues capturing games tho
[06:01:54 CEST] <furq> oh
[06:01:59 CEST] <furq> it's been years since i used desktop linux
[06:02:18 CEST] <kepstin> HokarPokar: the way gdigrab works is it asks windows, via an old api that's been around for many years "please give me a picture of the screen" - it's then up to windows and your graphics driver to do it
[06:02:45 CEST] <kepstin> HokarPokar: if your graphics driver doesn't return a picture for some window, well, nothing we can do
[06:03:02 CEST] <furq> if you want to live stream quake then you probably want OBS and an rtmp server
[06:03:11 CEST] <HokarPokar> kepstin: Thanks.
[06:03:14 CEST] <kepstin> yeah, gdigrab is slow
[06:03:35 CEST] <furq> if you don't care about livestreaming then just record a demo and have ioquake3 render it to a video
[06:03:46 CEST] <furq> i remember that working quite well
[06:04:00 CEST] <kepstin> that said, you're using the window select mode - sometimes games do weird things with windows, so it's picking the wrong one. You can try doing a screen capture instead.
[06:05:16 CEST] <Guest10906> Can anyone help me with FFMPEG video conversion
[06:08:41 CEST] <HokarPokar> kepstin: I will try once with dshow. But it throws the error Unknown input format dshow. So dshow isn't supported
[06:08:41 CEST] <Guest10906> join #worldchat
[06:09:43 CEST] <kepstin> HokarPokar: try using 'desktop' instead of the window name with gdigrab, see if that works. But yeah, you probably want to use obs for this.
[06:12:22 CEST] <HokarPokar> kepstin: That works. But as you said, it grabs the entire desktop instead of a specific window.
[06:12:49 CEST] <kepstin> yeah, so it's probably just an issue where the game has multiple windows and the selection is picking the wrong one
[06:24:40 CEST] <furq> maybe ioquake3's video recording isn't as good as i remembered
[06:24:56 CEST] <furq> it splits the video every 2GB, which is roughly every five seconds at 1080p60
[06:25:37 CEST] <furq> it also does not play back very well when it's trying to write 400MB/sec to a hard disk
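furq's "2GB roughly every five seconds" figure checks out if you assume ioquake3's AVI export writes raw 24-bit RGB frames (an assumption about its output format):

```python
def raw_video_rate(width, height, fps, bytes_per_pixel=3):
    """Bytes per second of uncompressed video at the given geometry."""
    return width * height * bytes_per_pixel * fps

rate = raw_video_rate(1920, 1080, 60)     # 373,248,000 B/s, roughly "400MB/sec"
seconds_per_2gb = 2 * 1024**3 / rate      # just under 6 s per 2 GiB split
```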
[06:26:30 CEST] <thebombzen> does anyone know the practical difference between yuv420p and nv12?
[06:27:08 CEST] <furq> nv12 is planar y and interleaved uv
[06:27:16 CEST] <furq> yuv420p is fully planar
[06:27:38 CEST] <furq> other than that they're identical
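furq's description maps to the following memory layout, sketched here assuming even dimensions and no stride padding:

```python
def plane_layout(pix_fmt, w, h):
    """(name, byte offset, byte size) of each plane in one raw frame."""
    y = w * h                  # luma: one byte per pixel in both formats
    c = (w // 2) * (h // 2)    # one chroma sample per 2x2 pixel block
    if pix_fmt == "yuv420p":   # fully planar: Y plane, then U, then V
        return [("Y", 0, y), ("U", y, c), ("V", y + c, c)]
    if pix_fmt == "nv12":      # Y plane, then one plane of interleaved UV pairs
        return [("Y", 0, y), ("UV", y, 2 * c)]
    raise ValueError(pix_fmt)
```

Both layouts hold exactly the same samples, which is why converting between them is lossless.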
[06:33:49 CEST] <thebombzen> is there any reason to use one over the other?
[06:34:20 CEST] <furq> use whichever is supported
[06:35:11 CEST] <furq> if you've got nv12 coming from a capture device then most codecs will transform it to yuv420p anyway
[06:35:29 CEST] <thebombzen> oh okay. I'll stick to yuv420p then
[06:35:51 CEST] <furq> planar is generally better than packed (or half-packed)
[06:36:00 CEST] <thebombzen> also, do you have any recommendations on lossless screen-record codecs? I'm currently using utvideo
[06:36:13 CEST] <furq> not really
[06:36:19 CEST] <furq> ffv1 seems good but i've never used it for rgb
[06:36:38 CEST] <thebombzen> ffv1 encodes too slowly for me to screenrecord. I use it for other things
[06:37:09 CEST] <thebombzen> although when I say "lossless" I'm okay with using swscale. I just don't want to quantize
[10:10:35 CEST] <yagiza> Hello!
[10:11:15 CEST] <yagiza> Is it possible to set output format context options with av_opt_set_dict()?
[10:16:58 CEST] <ashwinmuni> I have a lot of videos which need to be transcoded to multiple profiles, currently using ffmpeg with some scripts. Can anyone point me to a transcoding cluster?
[10:18:27 CEST] <ashwinmuni> Maybe someone who has some experience in Distributed Multi-bitrate Video Transcoding
[10:30:39 CEST] <yagiza> For some reason av_set_options_string() works all right, but av_opt_set_dict() doesn't.
[13:44:23 CEST] <yagiza> Hello!
[13:51:22 CEST] <yagiza> Any ides about av_opt_set_dict()?
[13:51:30 CEST] <yagiza> ideas
[14:00:35 CEST] <imperio_> hi, I'm wondering: when I download 32/64-bit libraries for windows, do I need to install external libs as well?
[14:00:49 CEST] <imperio_> (I never developed on windows so no clue about how it works)
[14:13:16 CEST] <yagiza> imperio_, if you mean the Zeranoe FFmpeg Builds, then no, you don't need to.
[14:13:45 CEST] <yagiza> imperio_, the Zeranoe FFmpeg Builds are statically linked.
[14:16:17 CEST] <imperio_> yagiza: even in DLLs?
[14:16:32 CEST] <yagiza> imperio_, yes
[14:16:40 CEST] <imperio_> damn, such a relief!
[14:16:45 CEST] <imperio_> thanks yagiza :)
[14:16:57 CEST] <yagiza> imperio_, you're welcome
[18:18:03 CEST] <Spring> is there anything worth reading for h264 encodings using ffmpeg?
[18:18:14 CEST] <Spring> besides the basic wiki here https://trac.ffmpeg.org/wiki/Encode/H.264
[18:19:05 CEST] <Spring> are there options worth adding outside the presets if I need a high quality encode optimized for still moving scenes?
[18:47:02 CEST] <HSKW> hello, does anyone have the latest ffmpeg with libaac?
[18:47:06 CEST] <HSKW> for windows
[18:47:23 CEST] <furq> if you mean fdk-aac then no
[18:47:30 CEST] <furq> builds with fdk aren't distributable
[18:49:08 CEST] <HSKW> on windows, is it possible to add the .dll library in the same dir?
[18:57:13 CEST] <yagiza> HSKW, yes, it is
[18:58:05 CEST] <agrecascino> how do i encode something into g726
[18:58:14 CEST] <HSKW> ok, what's the difference between libfaac and libfdk?
[18:58:59 CEST] <JEEB> fdk-aac is much better
[18:59:07 CEST] <JEEB> although now for LC-AAC you don't need external libraries
[18:59:09 CEST] <yagiza> agrecascino, what's the problem? Just use g726 as output audio codec.
[18:59:14 CEST] <agrecascino> nvm
[18:59:15 CEST] <agrecascino> got it
[18:59:20 CEST] <JEEB> you only need fdk-aac for HE-AAC
[18:59:36 CEST] <agrecascino> also
[18:59:42 CEST] <agrecascino> actually
[18:59:42 CEST] <agrecascino> wait
[18:59:43 CEST] <agrecascino> nvm
[19:06:26 CEST] <HSKW> i've installed libfdk-aac-1.dll
[19:06:37 CEST] <HSKW> right?
[19:06:51 CEST] <JEEB> it won't make a thing that doesn't have it linked work
[19:07:00 CEST] <JEEB> do you need HE-AAC?
[19:07:02 CEST] <JEEB> as in, encoding
[19:07:07 CEST] <HSKW> yes
[19:07:08 CEST] <JEEB> or do you just want to encode LC-AAC
[19:07:28 CEST] <JEEB> ok, if you need HE-AAC encoding, then build it yourself
[19:07:32 CEST] <JEEB> no other alternative
[19:09:37 CEST] <HSKW> i need it for RTMP... i don't know if LC or HE is better in this case
[19:14:11 CEST] <JEEB> HSKW: if you are planning to go under 64kbps for stereo then it's HE-AAC
[19:14:22 CEST] <JEEB> otherwise I would pick LC-AAC
[19:14:45 CEST] <JEEB> and in that case you would be fine with the internal aac encoder which was vastly improved by atomnuker
[19:17:33 CEST] <HSKW> LC-AAC is the libfdk-aac-1.dll that i've downloaded, right?
[19:18:08 CEST] <JEEB> no
[19:18:12 CEST] <JEEB> they are profiles of AAC
[19:18:35 CEST] <JEEB> HE is the one that degrades the sound more but enables you to get a somewhat listenable result at <64kbps for stereo
[19:19:06 CEST] <JEEB> LC is the one that leaves the sound in a somewhat better state but doesn't compress as well (thus >=64kbps is my random arbitrary number for its usage)
[19:19:39 CEST] <JEEB> a good LC-AAC encoder is already included in FFmpeg itself so if you are going to use that, you don't need fdk-aac or anything else
[19:19:44 CEST] <JEEB> you just need a recent enough FFmpeg
[19:20:37 CEST] <HSKW> Ok, but what is this libfdk? LC or HE?
[19:20:55 CEST] <HSKW> i see a lot of examples use this codec
[19:21:29 CEST] <JEEB> that's because it is the best third party library encoder and that's because the internal one wasn't good enough until like october (?) last year
[19:21:36 CEST] <JEEB> so of course you will have it mentioned in examples
[19:21:45 CEST] <JEEB> libfdk-aac supports both LC-AAC and HE-AAC encoding
[19:23:56 CEST] <HSKW> ah ok, many thanks.. i use ffmpeg-20160629-57d30fd-win64-static
[19:24:54 CEST] <HSKW> well, how can i transcode to h264 at 700K with lc aac audio at 96K, res 640x360, and send it to an rtmp server? i don't know how to set the command line
[19:25:14 CEST] <HSKW> the input is from here: udp://@127.0.0.1:8888
[19:30:15 CEST] <Spring> http://ffmpeg.org/ffmpeg-utils.html#time-duration-syntax
[19:30:36 CEST] <Spring> it says here negative values are permitted yet it gives me errors when using the -ss -to format
[19:31:00 CEST] <Spring> eg: -ss 00:00:00.00 -to -11
[19:31:27 CEST] <Spring> I would expect it to truncate from the beginning to 11s before the end
[20:19:32 CEST] <Spring> the wiki on seeking is a bit misleading
[20:19:52 CEST] <ChocolateArmpits> how so?
[20:19:56 CEST] <llogan> feel free to edit it
[20:20:03 CEST] <Spring> using -ss before -i results in very inaccurate timings
[20:20:23 CEST] <Spring> whereas placing -ss and -to after -i result in accurate timings
[20:20:31 CEST] <llogan> that can be expected depending on the input format and stuff n' junk
[20:21:07 CEST] <ChocolateArmpits> Spring: before -i works on the container level, after -i works on the stream level
[20:21:21 CEST] <ChocolateArmpits> but is much slower as the stream has to be decoded
[20:21:35 CEST] <ChocolateArmpits> I think
[20:22:06 CEST] <__jack__> "As of FFmpeg 2.1, when transcoding with ffmpeg (i.e. not just stream copying), -ss is now also "frame-accurate" even when used as an input option"
[20:22:20 CEST] <__jack__> (ffmpeg -ss .. -i ..)
[20:23:03 CEST] <Spring> that's what wasn't clear
[20:23:17 CEST] <Spring> it makes it sound as though it's frame accurate
[20:23:35 CEST] <__jack__> isn't it?
[20:23:59 CEST] <JEEB> -ss before -i is on demuxer level - it cannot be frame-accurate with formats where there are non-IRAP packets
[20:24:19 CEST] <JEEB> -ss after -i is on decoder level where all you have are decoded raw video and audio
[20:24:28 CEST] <JEEB> thus it can be fully frame-accurate in theory
[20:24:37 CEST] <__jack__> JEEB: so, the wiki is wrong ?
[20:25:29 CEST] <Spring> in one test the end was set to 00:00:14.559 but it actually ended closer to 00:00:17.528
[20:25:44 CEST] <Spring> using the -ss prior to -i syntax
[20:26:07 CEST] <Spring> reminds me of trimming via AviDemux
[20:26:07 CEST] <JEEB> __jack__: no idea what got fixed, but it is as accurate as something can be on the container level
[20:26:13 CEST] <__jack__> (tested it, it seems wrong)
[20:26:49 CEST] <JEEB> basically if you set a value that is after an IRAP it will go for the next one I think?
[20:27:07 CEST] <JEEB> because cutting on a non-IRAP makes no sense on container level
[20:27:20 CEST] <__jack__> what's IRAP ?
[20:27:26 CEST] <JEEB> Intra Random Access Picture
[20:27:35 CEST] <JEEB> aka something you can start decoding from
[20:27:41 CEST] <__jack__> I frame at the container level ?
[20:28:07 CEST] <JEEB> it is an I frame but additional limitations because you can't have references after it towards things before it
[20:28:26 CEST] <JEEB> also container level means that it has access to the encoded data but it doesn't decode
[20:28:35 CEST] <JEEB> so it can parse which packet is an IRAP and which is not
[20:29:12 CEST] <JEEB> just I frame means a picture that is full intra, but it doesn't promise you that you can decode what comes after it without pictures that came before that I picture
[20:29:32 CEST] <Spring> so basically the double quotes around "frame-accurate" in the wiki are very much intentional :p
[20:30:01 CEST] <JEEB> possibly -ss before -i was broken before, and now it should work as well as the demuxer works for a given format or so
[20:31:20 CEST] <JEEB> basically -ss before -i works with -c copy - after -i just won't work because nothing is decoded
[20:44:46 CEST] <Spring> on the -threads option, is it correct that specifying 8 threads is right for 2 cores, while 16 for 4 cores?
[20:47:05 CEST] <DHE> no, normally you would specify the number of cores (2 or 4) directly. maybe double for hyperthreading
[20:47:49 CEST] <Spring> I see. There doesn't seem to be a wiki page about it and various posters around the net have differing results
[20:48:19 CEST] <llogan> it depends on the encoder or decoder
[20:48:45 CEST] <llogan> for example, libx264 will automatically choose an appropriate value, so you don't need to use that option
[20:49:09 CEST] <Spring> any idea about libvpx?
[20:50:03 CEST] <furq> libvpx can't use 16 threads unless you're encoding 4k
[20:50:40 CEST] <llogan> i don't use that encoder but guess it is auto as well: "ffmpeg -h encoder=libvpx" shows "Threading capabilities: auto"
[20:51:42 CEST] <Spring> saw one stackexchange poster mention it can be automatic but it doesn't necessarily utilize all the cores
[20:51:58 CEST] <furq> yeah it uses slice-based multithreading
[20:52:13 CEST] <furq> the maximum number of threads it can use is floor(width / 256)
[20:52:46 CEST] <furq> i think there's also some other option you need to set correctly, maybe tile-columns
[20:54:20 CEST] <Spring> searching for tile-columns brought me to this https://github.com/Kagami/webm.py/wiki/Notes-on-encoding-settings
[20:54:47 CEST] <Spring> which mentions it can't auto detect the number of threads on a system so it has to be manually raised
[20:54:54 CEST] <furq> that looks right
[20:55:33 CEST] <Spring> bit confusing tbh. Is there some calculation for the right number of threads per core? Want to be able to tell users what they should set.
[20:56:05 CEST] <furq> not that i know of
[20:56:13 CEST] <furq> you'll rarely be able to use more than 7 threads anyway
[20:56:56 CEST] <Spring> if more are set via the option but the system doesn't support it, will it throw an error or just use the max supported number?
[20:57:03 CEST] <furq> it'll just use as many as it can
[20:57:20 CEST] <Spring> okay, I'll prob just set a max number like 8
[20:58:21 CEST] <furq> http://permalink.gmane.org/gmane.comp.multimedia.webm.devel/2339
[20:58:26 CEST] <furq> i'm not sure if that's still accurate
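Putting furq's floor(width/256) rule of thumb and the webm.py notes together gives a rough sketch. The 256-pixel minimum tile width used here is an assumption drawn from those notes, not checked against the libvpx source:

```python
import math

def vpx_max_threads(width):
    """Slice-based threading caps usable threads at floor(width / 256)."""
    return max(1, width // 256)

def vp9_tile_columns(width):
    """-tile-columns takes log2 of the desired tile-column count."""
    return int(math.log2(max(1, width // 256)))

# 1080p tops out at 7 usable threads - hence "you'll rarely be able to use
# more than 7" - and you need 4K-wide video before 15 threads do anything.
```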
[21:03:08 CEST] <Spring> from that github page "if your source is mostly static, you will have only one keyframe for the entire video"
[21:03:55 CEST] <Spring> so if I had say a video of a fixed camera and mostly still scene it would only add a single keyframe?
[21:09:41 CEST] <Spring> looks like there was a discussion on this on gmane.org
[21:59:47 CEST] <HSKW> JEEB: ffmpeg -i udp://@127.0.0.1:8888 -c:a libfdk_aac -b:a 96k -c:v libx264 -b:v 785K -f flv rtmp://
[22:00:01 CEST] <HSKW> what must i use for the audio codec?
[22:51:04 CEST] <Spring> So this is strange. I'm encoding a test clip with different settings and comparing the results in Photoshop, when I notice the identical frame I'd copied from the media player minutes before has changed.
[22:51:47 CEST] <Spring> I then notice /all/ frame captures have become identical when previously they displayed differences
[22:52:05 CEST] <Spring> that is, when copying (or saving as PNG) the same frames again
[22:52:54 CEST] <furq> that is strange
[22:53:03 CEST] <furq> unless you've overwritten the old frames, in which case it is not strange
[22:54:05 CEST] <Spring> no,  they're all different layers and locked as soon as they're placed. I can't even get the same frame from the same video. What's an easy and accurate way to extract a frame?
[22:55:23 CEST] <furq> ffmpeg -i foo -ss 12.34 -frames:v 1 bar.png
[22:56:15 CEST] <furq> or `ffmpeg -i foo -vf select=eq(n\,1234) -frames:v 1 bar.png` if you want a specific frame number
[23:15:29 CEST] <Spring> thanks
[23:21:08 CEST] <Spring> Good. It must have just been an anomaly with the video player I was using (PotPlayer).
[23:21:19 CEST] <Spring> ffmpeg outputs the same result as I had initially.
[23:23:20 CEST] <Spring> Solved the mystery. Somehow must have toggled PotPlayer's denoise filter while Ctrl+C'ing. *facepalm*
[23:45:20 CEST] <HSKW> Error while decoding stream #0:1: Invalid data found when processing input
[00:00:00 CEST] --- Fri Jul  1 2016


More information about the Ffmpeg-devel-irc mailing list