[Mesa-dev] [PATCH 4/8] clover/llvm: Use -cl-std and device version to select language defaults

Aaron Watry awatry at gmail.com
Sun Aug 6 03:04:52 UTC 2017


On Sat, Aug 5, 2017 at 8:56 PM, Jan Vesely <jan.vesely at rutgers.edu> wrote:
> On Sat, 2017-08-05 at 19:46 -0500, Aaron Watry wrote:
>> On Fri, Aug 4, 2017 at 1:32 PM, Jan Vesely <jan.vesely at rutgers.edu> wrote:
>> > On Sun, 2017-07-30 at 20:26 -0500, Aaron Watry wrote:
>> > > According to section 5.8.4.5 of the 2.0 spec, the CL C version is chosen by:
>> > >  1) If you have -cl-std=CL1.1+ use the version specified
>> > >  2) If not, use the highest 1.x version that the device supports
>> > >
>> > > Curiously, there is no valid value for -cl-std=CL1.0
>> > >
>> > > Signed-off-by: Aaron Watry <awatry at gmail.com>
>> > > Cc: Pierre Moreau <pierre.morrow at free.fr>
>> > >
>> > > v2: (Pierre) Move create_compiler_instance changes to correct patch
>> > >     to prevent temporary build breakage.
>> > >     Convert version_str into unsigned and use it to find language version
>> > >     Add build_error for unknown language version string
>> > >     Whitespace fixes
>> > > ---
>> > >  .../state_trackers/clover/llvm/invocation.cpp      | 61 +++++++++++++++++++++-
>> > >  1 file changed, 60 insertions(+), 1 deletion(-)
>> > >
>> > > diff --git a/src/gallium/state_trackers/clover/llvm/invocation.cpp b/src/gallium/state_trackers/clover/llvm/invocation.cpp
>> > > index 7c8d0e738d..ca75596b05 100644
>> > > --- a/src/gallium/state_trackers/clover/llvm/invocation.cpp
>> > > +++ b/src/gallium/state_trackers/clover/llvm/invocation.cpp
>> > > @@ -93,6 +93,65 @@ namespace {
>> > >        return ctx;
>> > >     }
>> > >
>> > > +   unsigned get_language_version_from_string(const std::string &version_str){
>> > > +      if (version_str == "1.0"){
>> > > +         return 100;
>> > > +      }
>> > > +      if (version_str == "1.1"){
>> > > +         return 110;
>> > > +      }
>> > > +      if (version_str == "1.2"){
>> > > +         return 120;
>> > > +      }
>> > > +      if (version_str == "2.0"){
>> > > +         return 200;
>> > > +      }
>> > > +      throw build_error("Unknown/Unsupported language version");
>> > > +   }
>> >
>> > I'm a bit conflicted about this. Returning an int from device.cl_version()
>> > might be nicer, though we are using C++ strings, so we probably don't
>> > have to worry about generating new strings all the time.
>>
>> While the device version could be changed to an integer, we still need
>> some string parsing, if we're going to detect the language version
>> specified in the build options. I could refactor out a
>> get_language_from_version_id(uint) and then use that in
>> get_language_from_version_str, which would at least let us keep the
>> device clc version as an integer the whole time, instead of needlessly
>> treating it as a string and then parsing it later.  But I do think
>> that we want to keep the string parsing piece, at least for now.
>>
>> >
>> > > +
>> > > +   clang::LangStandard::Kind
>> > > +   get_language_from_version_str(const std::string &version_str,
>> > > +                                 bool is_opt = false) {
>> > > +       /**
>> > > +        * Per CL 2.0 spec, section 5.8.4.5:
>> > > +        * If it's an option, use the value directly.
>> > > +        * If it's a device version, clamp to max 1.x version, a.k.a. 1.2
>> > > +        */
>> > > +      unsigned version = get_language_version_from_string(version_str);
>> > > +      if (!is_opt && version > 120 ){
>> > > +         version = 120;
>> > > +      }
>> > > +      switch (version){
>> > > +         case 100:
>> > > +            return clang::LangStandard::lang_opencl10;
>> > > +         case 110:
>> > > +            return clang::LangStandard::lang_opencl11;
>> > > +         case 120:
>> > > +            return clang::LangStandard::lang_opencl12;
>> > > +         case 200:
>> > > +            return clang::LangStandard::lang_opencl20;
>> > > +         default:
>> > > +            throw build_error("Unknown/Unsupported language version");
>> > > +      }
>> > > +   }
>> > > +
>> > > +   clang::LangStandard::Kind
>> > > +   get_language_version(const std::vector<std::string> &opts,
>> > > +                        const std::string &device_version) {
>> > > +
>> > > +      const std::string search = "-cl-std=CL";
>> > > +
>> > > +      for(auto opt: opts){
>> > > +         auto pos = opt.find(search);
>> > > +         if (pos == 0){
>> > > +            auto ver = opt.substr(pos+search.size());
>> > > +            return get_language_from_version_str(ver, true);
>> > > +         }
>> > > +      }
>> >
>> > I don't think you need the above. we only set the defaults, so clang
>> > should be able to parse this option on its own if we pass it along.
>>
>> It should... and yet, without setting the language defaults to CL 1.2
>> below, any kernel that uses the 'static' keyword fails to compile,
>> even with '-cl-std' set in the build options. Only when you invoke
>> set_lang_defaults with lang_opencl12 does it actually compile and run
>> properly. I guess clang *could* be getting confused about the
>> language version if the program specifies 1.2 while we explicitly
>> set the language defaults for 1.1.
>>
>> I guess I haven't tried to see what happens if you just omit the
>> set_lang_defaults call altogether, but I suspect that wouldn't end
>> well.
>>
>> >
>> > > +
>> > > +      return get_language_from_version_str(device_version);
>> > > +   }
>> > > +
>> > >     std::unique_ptr<clang::CompilerInstance>
>> > >     create_compiler_instance(const target &target,
>> > >                              const std::vector<std::string> &opts,
>> > > @@ -129,7 +188,7 @@ namespace {
>> > >        compat::set_lang_defaults(c->getInvocation(), c->getLangOpts(),
>> > >                                  compat::ik_opencl, ::llvm::Triple(target.triple),
>> > >                                  c->getPreprocessorOpts(),
>> > > -                                clang::LangStandard::lang_opencl11);
>> > > +                                get_language_version(opts, device_version));
>> >
>> > I'd imagine this could be something like
>> > get_language_from_version(std::max(dev.clc_version(), 120))
>>
>> But that would also set the default to a minimum of 1.2, while we
>> still only claim 1.1 support for all/most devices. I'm not quite ready
>> to do that (although I'd love to). I'd only be happy doing that once
>> we're convinced of API completeness and that we're at least not
>> regressing anything. There are a few built-in functions added in 1.2
>> that are still missing and that we should probably finish up before
>> bumping the CLC version (or just make it auto-detect based on
>> capabilities and built-in library functionality). popcount would be an
>> easy built-in to deal with, printf() not so much.
>>
>> Also, that max() call would automatically upgrade all programs to CL
>> 2.x if the device supports it, which is explicitly not allowed.
>>
>> 1) If the program specifies cl-std, use exactly that version, as long
>> as the device supports it.
>> 2) If cl-std is not specified, use min(120, dev.clc_version()).
>>
>> The above is basically what the get_language_version call, including
>> its for loop, is calculating.
>
> Yes, it should have been std::min instead of std::max.
> Looks like I don't fully understand the point of set_lang_defaults. Is
> the language correctly set after
> "clang::CompilerInvocation::CreateFromArgs()"? We pass all the options
> there.
> Does passing "LangStandard::lang_unspecified" instead of the CLC version
> change the behaviour?

Take a look at clang's CompilerInvocation.cpp:setLangDefaults, around line 1648.

If the language standard is lang_unspecified and input kind is CL, it
sets LangStd = lang_opencl10.
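
For reference, the fallback looks roughly like this (paraphrased from
memory, not verbatim clang source; the exact spelling of the input-kind
check differs between clang versions, which is why clover goes through
compat::ik_opencl):

   // CompilerInvocation.cpp:setLangDefaults (paraphrased): if no standard
   // was passed in and the input kind is OpenCL, clang falls back to
   // OpenCL C 1.0 rather than anything derived from the device version.
   if (LangStd == clang::LangStandard::lang_unspecified) {
      LangStd = clang::LangStandard::lang_opencl10;
   }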

I just did a few tests on my machine after changing my device/platform
version to 1.2:

1) Leave LangStd at the old value of opencl11: the static keyword fails
with "OpenCL version 1.1 does not support the 'static' storage class
specifier".
2) Use my patches to detect -cl-std: the static keyword gets through
compilation.
3) Use LangStandard::lang_unspecified: the static keyword fails with
"OpenCL version 1.0 does not support the 'static' storage class
specifier".

>
> If you need to parse the string at least once, then we can do it for
> dev.clc_version() as well.
> Maybe you can use something like
> "clang::LangStandard::getLangStandardForName()" to do the parsing?

I'm taking a look at that one. We can take the device_clc_version
string or the -cl-std string (e.g. "1.2") and pass it straight into
getLangStandardForName("cl"+dev.device_clc_version()), but that gives
us a LangStandard, not a LangStandard::Kind... so close, yet so far.
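
To illustrate the dead end (just a sketch; it assumes clang's
clang::LangStandard::getLangStandardForName(), which hands back a pointer
to the LangStandard table entry, and the variable names are mine):

   // ver would come from "-cl-std=CL1.2" or dev.device_clc_version()
   const std::string ver = "1.2";
   const clang::LangStandard *lang_std =
      clang::LangStandard::getLangStandardForName("cl" + ver);
   // lang_std describes the standard (name, description, feature flags),
   // but compat::set_lang_defaults() wants a LangStandard::Kind, and I
   // don't see a public way to get the Kind back out of it.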

If you take a look at the implementation of getLangStandardForName,
it builds a switch over the standard-name strings using a macro/include.
I guess it would be possible to suggest that clang add another method
that returns the LangStandard::Kind instead of immediately using it to
look up the LangStandard itself. If there's already a way to get the
Kind back out of the LangStandard, I haven't seen it yet.

I guess it makes sense to try to use the same include file that clang
provides to get the list of language versions, and then switch on the
string value of the language. If we either get that added to clang as
another function or implement something similar, at least we won't
have to update invocation.cpp again when CL 2.3 or 3.0 ends up getting
released.
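
Something along these lines is what I have in mind (a rough sketch only;
get_language_kind is a name I'm making up here, and ideally the Case list
would be generated from clang's LangStandards.def, or come from a new
clang helper, instead of being hard-coded):

   #include <string>
   #include "llvm/ADT/StringSwitch.h"
   // The clang::LangStandard declarations are already pulled in via the
   // frontend headers that invocation.cpp includes.

   // Map the version suffix of "-cl-std=CL<x.y>" or dev.device_clc_version()
   // to a clang language standard in one place.
   static clang::LangStandard::Kind
   get_language_kind(const std::string &version_str) {
      return llvm::StringSwitch<clang::LangStandard::Kind>(version_str)
         .Case("1.0", clang::LangStandard::lang_opencl10)
         .Case("1.1", clang::LangStandard::lang_opencl11)
         .Case("1.2", clang::LangStandard::lang_opencl12)
         .Case("2.0", clang::LangStandard::lang_opencl20)
         .Default(clang::LangStandard::lang_unspecified);
   }

Callers would then throw build_error() on lang_unspecified, and clamp to
1.2 when the string came from the device version rather than -cl-std.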

--Aaron

>
> Sorry for the lack of answers; I need to study the clang invocation API
> a bit more.
> I'm just trying to avoid duplicating what clang already implements, and
> ideally reduce clover's handling of the CL C language to a minimum.
>
> thanks,
> Jan
>
>>
>> --Aaron
>>
>> >
>> > Jan
>> >
>> > >
>> > >        c->createDiagnostics(new clang::TextDiagnosticPrinter(
>> > >                                *new raw_string_ostream(r_log),
>
> --
> Jan Vesely <jan.vesely at rutgers.edu>

