Welcome to Release Readiness Live.
It is important to watch Release Readiness Live so that you get a snippet of the coming attractions, of the functionality that's coming from Salesforce.
We really focus on the features that are going to be most useful and impactful with Release Readiness.
You have the great opportunity to actually hear it from the mouths of the people who spend a lot of time thinking about what exactly you all, as Salesforce developers and admins, are actually wanting to see.
Your feedback is so important.
I think that's one of the most amazing things about the Salesforce ecosystem, how involved Trailblazers are with the evolution of our products.
Release Readiness Live is really about to start.
Welcome to Release Readiness
Live from San Francisco.
I'm Gillian Bruce, Director of Developer Marketing at Slack and a self-proclaimed Salesforce super nerd,
which means I am so honored
to bring Release Readiness Live to you,
because this is where we get nerdy
and we cover the new product innovations
coming your way in the next release.
But before we get into my Happy Nerd Zone,
I need to remind you to only make
purchasing decisions based on currently
available features and functionality.
Now, with that covered,
I want to thank you for joining us today.
Thank you for investing your time
with us to learn what's coming in
the Summer 24 release, and thank you
for all the feedback you provide us.
Your feedback
is what helps our technology teams build
solutions centered around your experience.
We've got a great lineup of Release Readiness content for you today.
We're starting with our latest innovations for Salesforce developers with our Developer Preview.
And then after that, we've got a short break and we're right back here at 11 a.m.
If you missed any portion of the shows this week, you can watch them on demand on Salesforce+.
We are wrapping up Release Readiness
Live tomorrow
with Service Cloud and our Admin Preview,
and you can add all of these
individual sessions to your calendar.
Or you can just hang out here and join us
for the next two days.
Now we're going to spend the next hour
together with our developer advocates
and our product managers in person
here in the studio, who are going to be
giving us live demos highlighting the top
developer features coming in summer 24.
But it's not just about all of us
here in the studio.
We want to answer your questions
about all of these new features,
so join us in the chat on Salesforce Plus.
Hop on over to Salesforce
Plus and ask your questions right there
in the chat where we've got product
experts standing by to give you answers.
And we'll also be taking your questions
for our product managers
and our developer advocates
live here in the studio.
So without further ado, let's get started.
First up,
we have the amazing and returning RRL guest, Mohith Shrivastava, Director of Developer Advocacy, here to show us what's new with Apex in Summer '24.
Hello, everyone. Welcome to RRL.
I'm a developer advocate here at Salesforce, and I'm excited to share all of the innovations that we have made in Apex for Summer '24 for all of you Apex developers, including those of you focusing on Data Cloud in Apex.
With that, I have divided this into two segments: one focusing on the platform, and then we'll also get into the Data Cloud features as well.
So first, Apex updates for the platform.
So I have consolidated these three features because I think they are going to have an impact on all of the code that you are building.
First, we're going to take a look today at the five-level SOQL relationship query support that we have added.
This is going to make it very easy for all of you to efficiently query and manipulate complex data with relationships.
Next, I'm going to show you some demos of the formula evaluation (FormulaEval) class.
With this you will be able to construct and evaluate formulas at runtime, which is going to improve application flexibility.
And then finally, one of my favorite features for this release: Apex Cursors.
With this we will be able to process large SOQL query results in smaller batches within a single transaction.
So, you know, I know you all love demos.
So without further ado, let's hop on
to a demo and see all of this in action.
So first I'm going to start by showing you some example code that I have built to showcase this formula evaluator, which is in beta.
I want to make sure that you know that these things are in beta, so make sure you try them out as we bring these features into GA.
So here I'm looking at an account here.
And I have this field called rating
which depends on these two fields
territory and annual revenue.
So based on the territory
and the annual revenue,
the rating field formula is calculated.
Now, you might be tempted to build this as a formula field.
Nothing wrong with that, except that when your application grows, let's say you have hundreds of territories based on zip code, you will see that it will soon hit the formula limits.
And that's why I'm excited that with the formula evaluator class, all these kinds of functionalities can be easily achieved.
So let's look at this territory object.
Just as an example, I have two territories here, AMER and APAC.
And you can see that now, with a custom field called Rating Formula, I'm able to put my formula directly on my records.
And now I'm able to apply these formulas directly using Apex.
So let's see this in action.
And before we see this in action, I also want to point out that for this particular trigger use case, when I calculate this rating, there is also a custom metadata record here; I want to use this criteria formula as well.
Now that we have these formula fields, I don't want to hardcode this annual revenue into my code, because today the criteria says 10,000 in revenue, and maybe it changes to some other dollar amount in the future.
So I want to give that flexibility to my admins so they can keep these criteria and easily manage them.
I don't want to hardcode them.
So, coming back to the demo, let's take a look at it in action.
Here I am on an account, and the rep here is definitely not performing poorly; he just realizes that he made a wrong selection of the territory, so he's going to change it back here.
And you can see that the rating field actually flipped to a better rating.
So what just happened was the formula got applied.
Let's see how I built this trigger, using the code.
So here I am in that account trigger.
You can see that I'm now using this formula instance class, and this class uses a builder pattern, as you can see, and I've been able to pass in the criteria directly by reading the custom metadata that I just showed you.
So here there is a simple query where I read that metadata that has the criteria formula.
So now my triggers are more dynamic in terms of the criteria here.
And next I want to apply that to the rating field here.
So what I'm doing here is, there is a method called getTerritoryFormulas, and you can see there is a map here that I'm building.
Just like best practices, right? I want to write bulkified code here.
So I'm using this territory rating formula instance and making sure that I have a map where I can store these instances.
Here you can see that there is a map where I'm putting the territory ID and the formula instance; and again, the formula is stored on the record, in the Rating Formula field on the territory object.
So with that I'm able to simply apply that formula, and then we are able to just use that in the before-update trigger.
So this is, in a nutshell, the formula evaluator, and you can see how you can use it within your triggers.
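To make that concrete, here is a minimal sketch of what a bulkified before-update trigger along these lines could look like. It assumes the Summer '24 FormulaEval beta API (Formula.builder, withType, withReturnType, withFormula, evaluate), and the Territory__c object, Territory__c lookup, and Rating_Formula__c field are hypothetical stand-ins for the demo's schema, so verify the exact class and method names against the current Apex reference.

```apex
// Sketch only: beta FormulaEval API; Territory__c / Rating_Formula__c are hypothetical names.
trigger AccountRatingTrigger on Account (before update) {
    // Collect the territories referenced by the accounts in this transaction (bulkified).
    Set<Id> territoryIds = new Set<Id>();
    for (Account acc : Trigger.new) {
        if (acc.Territory__c != null) {
            territoryIds.add(acc.Territory__c);
        }
    }

    // Build one formula instance per territory from the formula text stored on the record.
    Map<Id, FormulaEval.FormulaInstance> formulasByTerritory =
        new Map<Id, FormulaEval.FormulaInstance>();
    for (Territory__c t : [SELECT Id, Rating_Formula__c FROM Territory__c
                           WHERE Id IN :territoryIds]) {
        formulasByTerritory.put(
            t.Id,
            Formula.builder()
                .withType(Account.SObjectType)                        // object the formula runs against
                .withReturnType(FormulaEval.FormulaReturnType.STRING)
                .withFormula(t.Rating_Formula__c)                     // formula text authored on the record
                .build()
        );
    }

    // Apply each territory's formula to its accounts before the update is committed.
    for (Account acc : Trigger.new) {
        FormulaEval.FormulaInstance f = formulasByTerritory.get(acc.Territory__c);
        if (f != null) {
            acc.Rating = (String) f.evaluate(acc);
        }
    }
}
```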
And you know, this is powerful, right?
And of course, with power, I want to remind you that with great power comes great responsibility.
So make sure that you evaluate this formula and test it in a sandbox before you apply it to your production application, and also consider the governance, because you need to think about governance now that all these formulas live right in your data as well.
I also wanted to show another example of this formula evaluator.
Lightning web components, right, can use Apex.
So here what I wanted to show you is a simple UI I've put together.
Again, it uses that formula evaluator class, and using it I can apply formulas to my existing records.
Like in this case, I'm just looking at a formula result which shows that all of my account records, sorry, contact records, have the formula result as false.
I mean, none of them have the last name blank.
But let's check for the first name here.
Let's flip the formula here and evaluate it.
And you can see I have one record where the first name is missing.
So you can see how powerful it is: you can apply these formulas directly to a set of records and then evaluate them directly at runtime, without having to create these fields on the database layer.
Let's try something a little more complex here.
That one was easy, but let's say I want to concatenate the first name, and you can see that the formula works with a text return type.
For something a little more complex, you can go here and evaluate it again.
The thing that I wanted to show is that I'm also reading, in my Apex, all the fields that I am using in my formula, and that is because these formula evaluators give me that capability.
Another capability I want to show is that if there is an error, Apex returns that error as well.
Let's try one with the date fields as well.
So here I am returning the age of the contact from the birthdate, and I just use an integer return type here.
It evaluated the dates for all the contacts that had a birthdate.
So you see the power of these formulas.
I mean, they give you this superpower to make your applications more flexible.
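Here's a rough sketch of that kind of runtime evaluation over a list of records: a boolean formula applied to queried contacts. It assumes the same beta FormulaEval builder API as above, and the class, query, and field list are purely illustrative.

```apex
// Sketch only: evaluate an ad-hoc boolean formula against a list of contacts at runtime.
// Assumes the beta FormulaEval API; verify exact class and method names in the Apex reference.
public with sharing class FormulaSpotCheck {
    public static List<Id> findMatches(String formulaText) {
        FormulaEval.FormulaInstance f = Formula.builder()
            .withType(Contact.SObjectType)
            .withReturnType(FormulaEval.FormulaReturnType.BOOLEAN)
            .withFormula(formulaText)                 // e.g. 'ISBLANK(FirstName)'
            .build();

        // The demo reads the formula's referenced fields from the evaluator to build its query;
        // for simplicity this sketch just selects the fields it expects the formula to use.
        List<Contact> contacts = [SELECT Id, FirstName, LastName, Birthdate FROM Contact LIMIT 200];

        // Evaluate the formula against each record, no formula field on the object required.
        List<Id> matches = new List<Id>();
        for (Contact c : contacts) {
            if ((Boolean) f.evaluate(c) == true) {
                matches.add(c.Id);
            }
        }
        return matches;
    }
}
```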
The next thing that I wanted to show you was cursors.
So let's actually get into the code example right away.
I want to show you an example of how I have made use of this feature called cursors, and the capabilities it adds to make your code more powerful.
So you all know Queueable classes, right?
With cursors, what I'm able to do is query a set of records and then paginate through them, either in the forward direction or in the backward direction, assign indexes to them, and then process them.
To show you, here is a simple Queueable class, which I've called my cursor Queueable, and what I am doing in this class is using the power of cursors.
You can see the Database.getCursor call, where I'm getting a cursor over all the records.
And what I want to show you here is that I'm able to use cursor.fetch, which allows me to pass in the current position and the batch size, and break the large set of query records that I have into the batches that I want to work with.
So with this feature, you will be able to process large chunks of records by breaking them into batches.
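Here's a rough sketch of the chaining pattern the demo describes: a Queueable that opens a cursor, works through it one chunk at a time, and re-enqueues itself until the result set is exhausted. It assumes the Apex Cursors (beta) API surface of Database.getCursor, Cursor.fetch, and Cursor.getNumRecords, and the query and batch size are just placeholders.

```apex
// Sketch only: Apex Cursors (beta) driven from a chained Queueable.
public with sharing class CursorQueueableSketch implements Queueable {
    private Database.Cursor cursor;
    private Integer position;
    private static final Integer BATCH_SIZE = 200;

    public CursorQueueableSketch() {
        // Open the cursor once; the chained jobs page through the same result set.
        this.cursor = Database.getCursor('SELECT Id, Name FROM Account');
        this.position = 0;
    }

    private CursorQueueableSketch(Database.Cursor cursor, Integer position) {
        this.cursor = cursor;
        this.position = position;
    }

    public void execute(QueueableContext ctx) {
        // Fetch the next chunk, starting at the current position.
        List<Account> chunk = (List<Account>) cursor.fetch(position, BATCH_SIZE);
        for (Account acc : chunk) {
            // ... process each record here (e.g. write a batch log entry) ...
        }
        position += chunk.size();

        // More rows left? Chain another job to continue where this one stopped.
        if (position < cursor.getNumRecords()) {
            System.enqueueJob(new CursorQueueableSketch(cursor, position));
        }
    }
}
// Kick it off from Anonymous Apex: System.enqueueJob(new CursorQueueableSketch());
```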
So how about we go and execute this code and see it in action.
Now, to make it easier for all of you, what I've done is create some back-end records called batch logs, so we can see the jobs that are happening, the batch size, and also the chunk size that these cursors are going through.
To execute that, I'm going to open a simple file here, a cursor script that I have written.
How about we go and execute this? Let's go back to our org.
And you can see that if I try refreshing, there are jobs happening now, in real time.
The script that I had here used a simple query, then I set the batch size, and I'm able to just cursor through the records and process them.
It's going to make your async jobs more powerful, and I'm really looking forward to seeing how you are going to make use of this feature in all your async operations.
The next feature that I wanted to demo here was the five-level SOQL support.
Again, I have a code snippet to show you, so let's open it.
As you can see here, this is a nested query where I have accounts, and then I have assets, and then line items and work orders for them.
So this is a nested query.
If you are working on use cases like field service, where you have multiple objects that are linked, you can easily make use of these five-level nested SOQL queries, and this is going to make it very easy to retrieve records.
Again, this is one of those features that is so powerful that you can query all these records; just also be mindful of how you're processing those records and how many records you are pulling when you are using nested-level SOQL.
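For reference, a nested relationship query along the lines of the one in the demo might look like the sketch below. The child relationship names (Assets, WorkOrders, WorkOrderLineItems) are the typical Field Service shapes, shown here going four levels deep; Summer '24 raises the limit to five, so adjust the relationships to your own schema.

```apex
// Sketch only: a multi-level child relationship query (Summer '24 allows up to five levels).
List<Account> accountsWithWork = [
    SELECT Name,
        (SELECT Name,
            (SELECT Subject, Status,
                (SELECT LineItemNumber, Status
                 FROM WorkOrderLineItems)
             FROM WorkOrders)
         FROM Assets)
    FROM Account
    LIMIT 50
];
// Be mindful of how many rows this pulls back before looping over the nested results.
```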
Now let's move back to the slides so we can cover some of the exciting features that we are bringing for all of you Apex developers who are working on Data Cloud.
Now, you all know Data Cloud; Data Cloud is making every cloud better.
You can definitely make use of Data Cloud to solve some use cases that we have never been able to solve with Apex.
Now we want to make this easier for you, so there are two different features that I want to highlight here.
First, static SOQL in Apex for data model objects.
In Apex you now have static SOQL support, which means that when you are working with these data model objects, you can use SOQL just like you do for regular objects.
With that you get strongly typed results, and it eliminates typecasting.
Previously you could use the CDP query classes as well, and you could put your SOQL or SQL there, but it required you to manually typecast and validate that query, and all of that was a lot of work.
So we want to eliminate that work, make it super easy, and streamline the developer experience you are used to.
Second, you can now mock these SOQL queries for Data Cloud, so that you can test all the queries that you are writing within your Apex.
So how about we actually jump to the demo and show this feature in action.
I have a simple class to show you; it's called ReservationController.
I'm using this class to pull all the reservations that are in Data Cloud for a particular contact in Salesforce.
I can just run the query, and you can see that I'm now able to use SOQL directly on these data model objects, just like we are used to on the platform.
Here I'm actually looking at a table called Unified Link Individual.
All of you working on Data Cloud know that this is the table where you unify all your records.
So now I'm able to query that table just like a normal platform query, and then you can see that I am able to easily pull the reservations.
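A controller like the one in the demo could look roughly like this; DMO API names end in __dlm, and the object and field names below (Reservation__dlm and its fields) are hypothetical stand-ins for whatever your Data Cloud data model actually defines.

```apex
// Sketch only: static SOQL against a Data Cloud data model object (DMO).
// Reservation__dlm and its fields are illustrative; use your own DMO API names.
public with sharing class ReservationControllerSketch {
    @AuraEnabled(cacheable=true)
    public static List<Reservation__dlm> getReservations(String individualId) {
        // Strongly typed results, no manual typecasting of a generic query response.
        return [
            SELECT reservation_number__c, reservation_date__c, party_size__c
            FROM Reservation__dlm
            WHERE individual_id__c = :individualId
        ];
    }
}
```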
Finally, I wanted to show the feature where we make it very easy for you to test all of this, with the mocking support that we are giving you here.
I have written a test for this reservation controller.
Here you can see that you can now have a stub provider, and we can implement one using this SoqlStubProvider interface.
All we need to do is extend it, and there is a method that you need to override, called handleSoqlQuery.
So here I can mock out all the queries.
You can see that the query that I have on this object, I'm able to mock it out; and even for the reservation object, I am able to mock out the results.
And if you go here, you can see that I'm able to stub all of these and then test my code.
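Continuing the hypothetical controller sketch above, a mocked test could be shaped something like this. The class and method names here (SoqlStubProvider, handleSoqlQuery, Test.createSoqlStub, Test.createStubQueryRows) are my recollection of the beta API and should be treated as assumptions to verify against the current Apex reference.

```apex
// Sketch only: mocking DMO SOQL in a unit test with the (beta) stub provider.
// API names below are assumptions; confirm them in the Apex reference before use.
@IsTest
private class ReservationControllerSketchTest {

    // Answers any SOQL issued against Reservation__dlm with canned rows instead of Data Cloud.
    private class ReservationStub extends SoqlStubProvider {
        public override List<SObject> handleSoqlQuery(
                Schema.SObjectType sot, String query, Map<String, Object> bindVars) {
            return Test.createStubQueryRows(
                Reservation__dlm.SObjectType,
                new List<Map<String, Object>>{
                    new Map<String, Object>{ 'party_size__c' => 4 }
                }
            );
        }
    }

    @IsTest
    static void returnsMockedReservations() {
        // Route Reservation__dlm queries to the stub for the duration of the test.
        Test.createSoqlStub(Reservation__dlm.SObjectType, new ReservationStub());

        Test.startTest();
        List<Reservation__dlm> rows =
            ReservationControllerSketch.getReservations('hypothetical-individual-id');
        Test.stopTest();

        Assert.areEqual(1, rows.size());
    }
}
```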
Let's just run it quickly to show you how it runs.
You can see it's all happening in real time, and it was instantaneous, because now we are working with mock records.
So I hope that with all of this innovation, it's going to be easier for you to work with Apex, and it's going to make your applications more powerful.
What do you think about all these features?
I mean, there are a lot of Apex improvements; that's really, really powerful.
I'm personally very excited about the formula evaluator.
I think it's very cool to see that; it's a great way to work with your data right there in Salesforce.
Thanks for sharing all that with us. Yeah,
absolutely.
Now, as a reminder for all of you tuning in, both on LinkedIn and on Salesforce+, please ask your questions.
We have experts standing by right in the Salesforce+ chat, and we are going to be taking your questions live here for Mo.
So we are going to go ahead and start
with the questions
because you're in the hot seat.
Yeah I'm ready okay. All right.
First question we've got is
let's talk a little bit more about that
formula evaluation.
what are some more use cases?
Yeah, I know, it's hard to narrow down the use cases, but I'm going to share a couple of examples that I have myself.
One is criteria-based sharing rules.
You know that there is a limit, like 100 criteria-based sharing rules, that you can have in Salesforce; with this, there is no limit.
You can have a bunch of accounts and put your sharing rules, in the form of formulas, directly on those records, so it becomes unlimited.
It's also going to make things much easier to process if you are working with, like, Pricebook Entry or CPQ use cases, where you have different rules for different products, right?
Then security management: again, you can put in transaction security policies and configure all of those rules in Apex.
So those are some of the use cases that I can think of.
It's really opening up a lot of new ways to work with and evaluate data.
Yeah, I mean absolutely. Yeah.
okay. We're going to keep going.
We've got a question from Chris on Salesforce+.
The question is: compiled formula fields have a character limit.
Is there a character limit on the formulas evaluated in Apex as well?
Yes, there is a limit, but I hope that you will not run into those limits, because you can break the formulas down in the form of records, and also think about how you can break them into micro formulas.
Let's call them micro formulas. Maybe I just invented that.
I like that; I like micro formulas. Yes.
Well, we've got more questions.
So for the next question, we're going to talk about SOQL, because you showed some very cool SOQL innovations.
Can we mock regular SOQL queries, like you were able to mock SOQL against DMOs?
I wish we could. You know, Data Cloud developers, you are lucky, you get that feature; but hopefully in the future we will have something like that.
We will put that in as a request for the product team, so they are aware of it, and hopefully they provide something like this.
I mean, there is a StubProvider today that you can use; it's a little more challenging to work with, but we can make it better for you.
So great, great feature request.
Sounds like a good thing
to put on the idea exchange.
Yes, it is one for the IdeaExchange.
By the way, if you don't know what the IdeaExchange is, it's a way that our product teams actually directly get feedback from you
actually directly get feedback from you
about the features
you want to see in the product,
and they use it when they're planning
what they're going to build each release.
So if you haven't checked it out,
go to the idea exchange.
Another question for you, Mo.
Can you explain what use cases
you would use cursors for versus
using batch apex?
That's really an awesome question, actually.
I mean, Batch Apex also breaks records into batches, right?
The thing with Batch Apex is, if you have worked with it, it has a limit: 2,000 rows is the maximum batch size that you can have.
And a lot of our customers have more than that in a set of records, and with cursors they can process that within the transaction limits that we provide.
So I think cursors are great for those use cases, where you can, let's say, process 5,000 records within that transaction; maybe I'll use cursors there.
Batch Apex is good for long-running processing, while cursors are going to be effective if you want to get your job done quickly.
Different scope,
different types of projects.
Okay, we got another question.
This question is from Anna on Salesforce+: will the formula evaluator be included directly in flows?
I think I will need to tap the Flow product manager to get an answer on that.
But as a workaround, you can always write a little bit of Apex and put it in as a part of the flow.
You can use invocable Apex methods, right? Put your formula there, then expose it to your flow as well.
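As a sketch of that workaround, an invocable method wrapping the formula evaluator might look something like this; again, it assumes the beta FormulaEval API, and the request/response shape and the fields queried are illustrative only.

```apex
// Sketch only: exposing runtime formula evaluation to Flow through an invocable method.
// Assumes the beta FormulaEval API; input/output shape and queried fields are illustrative.
public with sharing class EvaluateFormulaAction {

    public class Request {
        @InvocableVariable(required=true) public String formulaText;
        @InvocableVariable(required=true) public Id accountId;
    }

    public class Result {
        @InvocableVariable public String value;
    }

    @InvocableMethod(label='Evaluate Formula Against Account')
    public static List<Result> evaluate(List<Request> requests) {
        // Bulk-load the accounts referenced by all requests in one query.
        Set<Id> accountIds = new Set<Id>();
        for (Request req : requests) {
            accountIds.add(req.accountId);
        }
        Map<Id, Account> accounts = new Map<Id, Account>(
            [SELECT Id, Name, AnnualRevenue, Industry FROM Account WHERE Id IN :accountIds]);

        List<Result> results = new List<Result>();
        for (Request req : requests) {
            FormulaEval.FormulaInstance f = Formula.builder()
                .withType(Account.SObjectType)
                .withReturnType(FormulaEval.FormulaReturnType.STRING)
                .withFormula(req.formulaText)
                .build();

            Result r = new Result();
            r.value = String.valueOf(f.evaluate(accounts.get(req.accountId)));
            results.add(r);
        }
        return results;
    }
}
```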
Okay, I like that. Well, we had the Flow Release Readiness Live yesterday, and we definitely have those experts standing by.
So in the Trailblazer Community, if you still haven't gotten that question answered, ask it over there and we'll get an expert answer for you.
You are not done being in the hot seat, but you're moving to a less hot seat, because we have another presentation to get to.
There's more great stuff coming for developers.
Next up we have the incredible Danielle Larregui, Senior Developer Advocate, here to give us the latest innovations from Data Cloud.
Hello, Salesforce developers.
My name is Danielle Larregui and I'm a Senior Developer Advocate here at Salesforce, focusing on the Data Cloud platform.
I'm really, really excited about the Summer '24 release and all of the new innovations that are coming out to Data Cloud.
So speaking of those innovations, I'm going to give you my top five.
As you know,
our releases have a lot of innovation.
So it was really,
really hard to choose these five.
But I'm really, really excited about each and every one of these.
So the first one is the ability to connect Data Cloud to Heroku natively and get data from a Heroku Postgres database.
As you know, Heroku is a platform to build apps and websites, and a lot of data is fed through Heroku; that data will now be even more powerful with the Data Cloud connector.
We also have some new enhancements coming out for CRM enrichment, the first one in the form of copy field enrichment.
Previously, you could only use copy fields with the contact and lead objects, but now we have expanded support to multiple Salesforce standard objects, as well as a variety of Salesforce custom objects.
And for CRM enrichment related lists, we previously only had support for the contact and lead objects, but now we have expanded that functionality to the account standard object.
For all my developers, we know that you love working with APIs.
We have a new API that is available with data graphs, which I'm going to demo today.
It's going to be much faster and more performant than just using the Connect APIs and the Query APIs for pulling data.
And then we have second-generation packaging, which is currently available for ISV partners.
It's going to make packaging data kits, and sharing all of the cool things that you're building in Data Cloud or on top of the Data Cloud platform, more shareable and more available to publish to places like AppExchange, as well as to use with version control and source control.
So there are a lot of amazing updates that our product managers have released coming out with Data Cloud.
So let's first get started with Heroku.
As I previously mentioned, Heroku is a platform as a service for building your own applications: web apps, mobile apps.
And a lot of data, as you can imagine, is being fed through all of these applications as users are using them.
So there's a lot of behavior, analytics, and views, and all of that data can be captured in Heroku's very own Postgres database.
Well, with the new native connector
to data cloud, all of those interactions
are now available to be ingested
into Data Cloud's data lake,
so that you can start using Data Cloud's functionality, like activations, reporting, and segmentation, and all of the functionality that can plug into Data Cloud.
So I'm going to go ahead and demo
that for you right now.
So let's get into the demo.
So here I am in Heroku, and I'm in my Heroku Postgres database.
You can see here to the right that I have a few tables that are already collecting some information in my Postgres database: a table here for guests, a table here for metrics, as well as a table here for systems.
And here I am now in Data Cloud Setup.
I already have my connector configured so that I can get data in from Heroku into my Data Cloud org.
I'm going to click
into this Heroku connector.
And you'll see that this is as easy as
just doing a little bit of configuration.
So all I need to do is put in the username
and password for my Heroku user.
I just need to put in my connection URL, as well as my database name, and just as easy as that, it makes the connection between Data Cloud and Heroku, so I can start ingesting data into my data lake from Heroku.
Now, speaking of ingesting data into my data lake from Heroku: the way that we get data in from our external systems into Data Cloud is via something called data streams.
So here I am on my Data Streams tab, and what I'm going to go and do is click this New button.
You'll see that I have a new Heroku Postgres option right here, and I'm going to select it.
This means that Heroku is now available as a source system to ingest data from.
And I'm going to click next.
And then I'm going to choose my schema
where I set up my tables.
And right here you see
I have my guest table.
My metrics table and my systems table
which were in Heroku.
I'm just going to select
one of these tables and click next.
And from this screen
I can set my primary key.
And I can set the fields
that I want to ingest.
So all this data can easily come in from Heroku into Data Cloud.
And that is so powerful, because all of that data that's stored in Heroku, with just a little bit of configuration, is now available for all of my activations in Data Cloud.
Now let's look at our next innovation
that also came out
as of the summer 24 release.
So copy field enrichments.
Now, I love copy field enrichments
because I used to get a lot of questions
when I used to go to user groups
and speak about
how can I get my data
to copy back into Salesforce CRM?
Well, enrichments is a really,
really easy way to copy
data from Data Cloud
Data Lake over into your Salesforce CRM.
It's all configurable and very,
very easy to set up.
You can copy over fields on data model objects, as well as calculated insights that you build in Data Cloud, and all of these can be displayed in a field on, now, multiple Salesforce standard objects and multiple Salesforce custom objects.
It even keeps them in sync for you, so that you don't have to worry about the data being, you know, not refreshed or out of date.
Now let's go into the
demo to see how this works.
So here I am in my Data Cloud org.
Previously, you might remember, when you would access enrichments you'd have to go to the contact or lead setup.
Well, now this has all been moved over to a centralized area in Setup.
So I'm going to go into my Salesforce Setup, type in 'enrichments' here, and select Copy Field.
And this is where I will configure all of my enrichments for Data Cloud to pull the data from Data Cloud's data lake and data model objects over into a CRM object.
So I already have a couple of enrichments configured here for the account object.
I'm going to click the one at the top, and you can see here that I have the ability to copy over my open cases field on my data model object, as well as my last modified date, and display those on the account.
I can see some details about the data source object that's configured, and I can even see some details about the sync history, so I know how recent the data is that's coming over into Salesforce.
Now this is super powerful because
once again it's just configuration.
You can easily pull this over.
Previously you had to use the Connect API and some queries, and it just made things a little bit more difficult; this makes it easy to do in as little as five minutes with configuration.
Now that you've seen that,
let's look at our next innovation
that's coming out for the Summer
24 release.
So, continuing with enrichments, we now have related list enrichments, which were previously also only available on the contact and lead objects.
But we know that for Salesforce,
one of the main objects
that people are working
in is usually accounts.
And so we have now expanded
this functionality
to be available on the account object.
So let's go into the demo
and see how I can display
these related lists on an account.
So here I am in my Data Cloud org.
And you can see here that I'm on an account, or excuse me, I'm actually in Salesforce Setup and I'm on the Account object.
And I have a Data Cloud related list right here, selected.
I was able to bring over cases from an external system and have those displayed on an account.
Again, it's all just configuration.
It's really, really powerful for either a developer or an admin to set this up, and it's going to enable you to display multiple related data model objects on an account.
It's going to be very, very powerful, and I know developers will really, really enjoy working with this.
Now that we've seen that,
let's look at our next enhancement.
So our next enhancement is with data graphs and the Data Graph API.
So with the new Data Graph API, you now
have the ability to create a data graph
which allows you to create relationships
between multiple data model objects.
And then you can use the API to query
for a particular object,
as well as its related objects and records
from those related objects.
Let's get into the
demo to see how this works.
All right, so here I am in my data graph
and you can see that
I have my primary data model object
which is my individual object.
And you can see that I have a couple
of related data model objects.
The contact point address, contact point email, and contact point phone.
I'm going to go ahead and just click
preview so you can see the Json
structure of the data
and how it's structured on the back end.
And so what I'm actually going to do
is I'm going to use this data graph
that I built to query an individual
and pull back its related
contact point address, contact point
email and contact point phone records.
So in order to do that, I'm going to use Postman.
The first thing that I need to do is an authorization from Postman to the Salesforce org that is hosting my Data Cloud.
So I'm going to go up here and perform my first authorization request; I'm just going to click Send.
What this is going to do is use OAuth to feed me back a token.
Then I need to perform a second authorization to the tenant-specific URL where my Data Cloud is hosted.
In order to do that, I'm going to do another POST request, and what that's going to do is exchange my previous token for a new token.
After that I am fully authenticated, so I can actually use that data graph I just showed you to pull back a single record.
I'm going to feed this a single record ID, and what it's going to do is query the individual and then return me the associated contact point email, address, and phone number for that individual.
Let's go ahead and see how this works.
And just as quickly as that
I now have back my record
as well as all of the related records
using the Data Graph API.
This is going to be way more powerful than
using the connect API to query records.
It's going to be a lot easier
to be able to query a primary record
as well as the associated records.
So I'm really, really excited about this Data Graph API that just came out as of the Summer '24 release.
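The demo makes these calls from Postman; if you wanted to make the same data graph call from Apex once you have the tenant-scoped Data Cloud token, a rough shape could look like the sketch below. The endpoint path and query parameter are placeholders rather than the documented resource, so check the Data Cloud API reference for the exact URL format.

```apex
// Sketch only: calling the Data Graph API over HTTP from Apex.
// The path and parameter names are placeholders; consult the Data Cloud API reference.
public with sharing class DataGraphClientSketch {
    public static String fetchGraph(String tenantEndpoint, String dataGraphName,
                                    String recordId, String dataCloudToken) {
        HttpRequest req = new HttpRequest();
        // Placeholder URL: per-tenant Data Cloud endpoint plus the data graph resource.
        req.setEndpoint('https://' + tenantEndpoint + '/api/v1/dataGraph/' + dataGraphName +
                        '?recordId=' + recordId);
        req.setMethod('GET');
        // dataCloudToken is the tenant-scoped token obtained by exchanging the org's OAuth
        // access token, as shown in the Postman demo above.
        req.setHeader('Authorization', 'Bearer ' + dataCloudToken);

        HttpResponse res = new Http().send(req);
        // The body is the JSON previewed in the data graph builder: the primary DMO record
        // with its related contact point records nested inside.
        return res.getBody();
    }
}
```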
Now let's look at the final thing that just came out.
So, we previously had first-generation packaging, and some people might not have been aware that you could already package up data lake objects, data models, and different metadata that you built inside of your Data Cloud org.
Now we have second-generation packaging, which is currently available only for ISVs, but we will be expanding this soon so that everyone can use second-generation packaging.
This is going to be a lot faster and a lot easier for packaging up all of the cool things that you're building in Data Cloud.
You'll be able to share these packages easily, and you'll even be able to use them with source control, like Git, as well as scratch orgs, to further integrate with the DevOps process you use right now for development.
Now, let's go ahead and take a look at what one of these packages looks like and how they work.
So here I am in my code editor, and what I've already done is create a data kit package.
It already consists of multiple source objects: I have some data model objects in here, and I even have some calculated insights packaged up.
And now what I can do with this is share it to multiple scratch orgs, share it to multiple Salesforce orgs, publish it on Git, and even publish it to AppExchange if I wanted to.
So really, really amazing things
for developers to be able to share
all the cool things that they're building.
I'm really excited about everything that came out in the Summer '24 release, and I hope that you are too, Gillian.
I mean, this is like a Data Cloud palooza.
There are so many amazing things
happening.
Come on over here
because we want to talk more about it.
And I know we've got a lot of questions coming in in the chat, but Danielle, you showed so many things.
I mean, I know we've been talking nonstop about AI, but it's really all about the data, and so Data Cloud is so, so important.
I mean, we shared a lot. Thank you.
but we got a lot of questions for you.
so we're going to go ahead
and get started.
So now you're in the hot seat.
You ready? Yes. Ready.
Mo, you can take a little break in the warm seat.
Well, he warmed this one up for you.
okay, so let's actually let's keep talking
about data cloud data kits.
what are the differences between
first and second generation packaging?
Yes, that is a great question.
With first-generation packaging, you previously couldn't use it with a scratch org.
With second-generation packaging, you can now package up all your data models and data kits and push them to a scratch org, or you can retrieve them from a scratch org and promote them to other orgs.
Okay, I like that. Again, I think the theme of all the things you showed was just that it's making it so much easier to connect to your data
without having to kind of go
the API route,
really kind of building that right
into the platform, which is really nice.
Yeah, this is going to make things better for everyone that's been asking me about sandboxes and Data Cloud.
That's actually the next question on here.
It's going to make it easier also to promote changes to Data Cloud sandboxes when that comes out as well.
That was Eddie's question from Salesforce+. Eddie, she already knew that you had that question.
It was: when can we use Data Cloud in a sandbox?
excellent. Okay. More questions for you.
Unless you're going
to predict the questions, I mean it,
right. I'm doing a good job.
okay, the next question is can data
ingested from Heroku
be copied over to Data Lake objects
and be mapped to data model objects?
Yes, that's a great question.
So similarly to how you can map any data
that's coming in from any other
data stream, you can also, map the data
that's coming in
via Heroku as well to data Lake
objects, data model objects.
And you can use it
with all of the functionality
that's available in data cloud,
like segmentations, reports, activations.
It's really going to open up
the possibilities for Heroku
in terms of being able to use the data
cloud platforms activations with it.
I mean, we are all one happy family, right?
So it's nice that Heroku is more closely embedded with our core platform there.
Okay, we have another question for you.
This is a question from Salesforce+, from Kenneth: what zero-copy sources are available for Data Cloud? He's looking to use Azure Data Lake.
Oh that's a great question.
So currently for zero Copy
we have integrations with snowflake.
And we currently have one with Google
BigQuery.
and I believe Databricks is on the roadmap for what's coming as well.
All right. Well,
it sounds like, kind of stay tuned.
There's more. There's more coming.
There's more coming.
Another question for you, Danielle: what's the difference between copy field enrichments and related list enrichments?
I mean, we saw them on the screen, but talk a little bit more about that.
Yeah, that's a great question.
So, copy field enrichments allow you to copy over a field that's on a data model object.
Similar to how in Salesforce you have standard and custom objects, in Data Cloud you also have standard and custom data models, and on each of those data models are fields.
So with copy field enrichments, you can copy the actual values that are in a field on the data model object over to a Salesforce standard or custom object field, whereas with related lists you're copying over records, just like how related lists in Salesforce are records related to the primary object.
So you'd be copying over a table of records.
I got that. I mean, it's taking me back to that analogy of a record being basically, you know, the rows and the columns of the spreadsheet.
It's a row of the spreadsheet versus just what's in that individual column.
So, okay, I like that. Thank you for giving me context there. Okay, great.
We have one more question for you, Danielle, before we move on, because guess what?
Okay, this is a question from Vijay on Salesforce+: can we pull multiple custom objects and show them in a data graph at the same time?
So data graphs will be available to work with both standard and custom data models, and you will be able to pull in multiple nested data model objects related to a primary data model object, similarly to how we showed with individual being the primary, and contact point email, phone, and address being three separate data models that are all related to the individual.
So yes, you will be able to do it with multiple nested objects.
I love that. Well thank you.
Thanks for handling the hot seat.
You're not done yet, but you're going to stay on the warm couch, okay? Because we've got more to cover.
Thank you so much, Danielle.
As a reminder, keep asking your questions in the chat on Salesforce+, because we've got more great innovations coming your way.
Next up we've got Luke Levasseur, product
manager here to show us the latest
and greatest with Einstein 1.
Take it away, Luke. Amazing, thank you so much, Gillian. Hi everyone.
I'm Luke Levasseur, I'm a product manager on our AI Cloud products, and I'm really excited to show you what's new with the Einstein generative AI platform today for devs.
To kick things off, I want to touch on a few features at the Einstein 1 platform model agility layer.
To start off with,
we've heard your requests
for a new set of composable APIs
for use in your applications
to interface directly with the Einstein
Trust layer.
This is why we're exposing our Models API to you all, via Apex and REST, in the summer timeframe.
Additionally, we've heard you asking to use additional models hosted on different model infrastructures.
We're now exposing models hosted on Bedrock and Vertex, such as Anthropic Claude and Google Gemini, in the summer release.
And on top of all of that, we know
that good AI is powered by strong data.
That's why we're enabling you to use
Data Cloud's vector database
for semantic search
on your enterprise unstructured
data via Apex and Retrievers
in the summer release.
So let's take a look at the Models API in action.
Here on the Apex Classes page in Setup, I first want to show you where this all lives: the new Models API class and child classes.
These are what we're going to build on top of for the remainder of this demo.
at the four new endpoints we've exposed.
The first is the Create Generations
endpoint from the models API.
This endpoint takes a prompt,
a set of model parameters,
and a model name, and generates
a single shot response to that prompt,
which you can then compose in apex
or rest use cases in your applications.
The embeddings endpoint again takes a model name and a string of text, and generates a vector representation of that text for semantic comparison.
Feedback is your way to provide human-generated feedback for a given generation.
It takes a chat generation ID and human-generated feedback, as well as a sentiment, and then it stores this via the Models API to Data Cloud.
The last endpoint, which I'm super excited to introduce, is the Chat Generations endpoint.
This is a conversational endpoint, so it takes a history of conversational messages sent between an LLM and a user, and provides contextually relevant output from the LLM based on that historical context.
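As a rough illustration of how one of these endpoints could be composed from Apex, here's a sketch of a chat-style call over REST. 'Models_API' is a hypothetical named credential for the Einstein platform API host, the resource path and request body shape are placeholders, and each message in the history is assumed to carry 'role' and 'content' keys; confirm all of that against the Models API documentation.

```apex
// Sketch only: composing a chat request against the Models API over REST.
// Named credential, path, and body shape are placeholders; check the Models API docs.
public with sharing class ChatGenerationsSketch {
    public static String ask(List<Map<String, String>> messageHistory, String modelName) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint('callout:Models_API/einstein/platform/v1/models/' + modelName +
                        '/chat-generations');
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/json');
        // The endpoint receives the running conversation, not just the latest prompt,
        // which is what lets "What are the action items?" resolve against the transcript
        // pasted into earlier messages.
        req.setBody(JSON.serialize(new Map<String, Object>{ 'messages' => messageHistory }));

        HttpResponse res = new Http().send(req);
        return res.getBody();
    }
}
```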
Now let's go take a look at what we can do with the Chat Generations endpoint.
Navigating to our seller home, I can see in the bottom right corner of the utility bar that I now have an AI chat app that I've built with the Models API.
Here, what I've just done is paste a very long meeting transcript into the AI chat app, and I've asked it to summarize the meeting.
So while the AI chatbot thinks, go upvote an idea on the IdeaExchange; shout out to the IdeaExchange once again.
What's happening behind the scenes is we're sending this transcript off to the LLM, and the LLM is generating that summary.
But I really want to know: what are the action items?
So let's ask our chatbot right here.
And you'll notice that 'what are the action items' doesn't mean much without the context of the previous messages.
This is where the conversational history that's enabled by the Chat Generations API really comes into play.
So here the chatbot is able to respond with the two action items based on the previous messages from this conversation, enabling you to have these conversational, contextually relevant experiences with your LLMs.
Now let's go back to the slides
and see what's new in Prompt Builder.
So in Prompt Builder
we have a host of new features
to introduce to you all
following our GA in February.
The first that I want to call out is that we've heard your complaints that the error messages we were surfacing in Prompt Builder weren't always helpful.
So we've improved these error messages, surfacing relevant context to you about why an LLM response may have failed.
We're also enabling you to use free text
inputs in prompts as grounding resources.
This lets your users define at runtime
the text that they want
to ground their prompts on.
Snapshot grounding automagically grounds
your prompts in the most relevant fields
from the user's page layout.
Large text inputs ensure that those pesky errors related to overflowing a context window for a given model won't bother you anymore.
And last but not least, structured outputs ensure that when you request JSON from an LLM, you'll get a response in that format.
So let's take a look at a couple of these in Prompt Builder now.
Here I'm in Prompt Builder, and I'm going to open up the Summarize Open Cases prompt.
The first thing I want to show you all here is the model selector in this right-hand panel.
It's not there quite yet, because this is coming in June; you'll see standard models for Anthropic and Gemini in the June timeframe.
Today we're just working against OpenAI models.
In this prompt, we're asking the LLM to roleplay as a support representative, summarizing a set of open cases for an account.
You'll notice also in the prompts,
we're asking for a specific format, right?
This prompt is asking for paragraph
formatted summaries per case.
But what if a different support agent
wants bullet point formatted summaries?
Historically, we may have had to use
two different prompt versions
or two different prompt templates
entirely to reflect those different
formatting needs.
Now, however, with the ability to provide free text into a prompt as a merge field, we can do this in real time, at runtime.
So you'll see here we've specified the format as a merge field, which we've inserted via this menu in the resource picker.
And then we're actually able to specify what we want that format value to be in Prompt Builder, right here.
So when we preview this prompt with the format specified as bullet points, against the Edge Communications account, we'll see that the response back from the LLM is returned to us in bullet points; and a different user would be able to type in 'paragraphs' or any other format at runtime in their application experience.
So that wraps up my demo for today.
Gillian, I'm really excited to hear what you think of all these new endpoints.
I mean, I love that you can actually tell the chatbot, hey, I want it in this format, and now that's a merge field, instead of, like you said, having to build multiple prompts or multiple prompt templates before.
Yeah, there's so much new flexibility with all this new functionality;
we're really excited
to put it into your hands
and to see what use cases you act on.
Yeah. Well, thank you, Luke, for all that.
And welcome to the hot seat.
It's been warmed up twice now.
So, we got a lot of questions for you.
All right.
As a reminder,
for all of you tuning in, please
ask your questions
right here in the Salesforce Plus chat.
We've got experts
standing by both in the chat
and then also right here in person.
So let's get going
with some of those questions.
The first question I have for you, Luke, is: which models can I use with the Models API endpoints?
The Models API endpoints are compatible with any model you've configured in Model Builder.
Today these are OpenAI models; they can be hosted on OpenAI infrastructure or Azure infrastructure.
And then in the summer timeframe, as I mentioned earlier, we'll be opening that up to Anthropic and Vertex models; sorry, Bedrock and Vertex models.
Again, also a reminder: the forward-looking statement applies to all of these roadmap items.
We have a question here from Salesforce
Plus for you, Luke,
how does Einstein
AI ensure data privacy and security?
Aren't we using company data to train the AI model?
And that's something that we hear a lot.
The first thing I'll say is: no, we're not using company data to train the AI model.
There are some roadmap items that we're looking at for that, which you can opt into, around fine-tuning your model.
But we have an entire Einstein Trust Layer built out that the Models API sits on top of, to ensure that company privacy is respected, right?
So, a couple of features I'll call out in the trust layer: when we retrieve any field values
from the Salesforce org, we're retrieving
those with the executing user's context
so that if the user doesn't have access
to a field, they won't be able to read
that or send that to the AI model.
Additionally, we have zero data retention agreements with all of our model providers, so whether these models are hosted
on OpenAI
or on Azure or on Anthropic and Gemini
coming soon, none of the data
that is included in a prompt is stored
and used to train those models.
And then, again, we're sending the majority of this data via the prompt, as grounding data, right, and none of that data is being stored by the model.
So you can rest easy that the Einstein Trust Layer has you taken care of; your company's data privacy is being respected.
As we like to say at Salesforce, your data is not our product.
Thank you for explaining that
a little bit more.
some we're getting some love
and some shout outs
from the amazing viewers that are on.
First of all,
apparently we have attendees from Calgary,
from Mumbai, from New York and Wisconsin
all around the world.
So thank you all for tuning in.
and Anna is very excited
about all the new features
and looking forward to using them.
We're excited
to see what you do with them as well.
All right, Luke, coming back to you,
we got some more questions.
where can I find the context window limit?
So the best place to look for that is in the help documentation; it varies model to model, right?
With a few of our models, we also have it in the model name, for the standard models, I'll say.
But yeah, it's again a function of the model that you choose to use with the Models API or with any of our other gen AI functionality.
Excellent. Thank you for answering that.
Okay, so this is actually something I was really interested in; the demo part about using page layouts for grounding was really great.
So whose page layout is really used for that record layout grounding?
So we recognize that different users
have different page layouts.
and the relevant fields are going to be
on a specific user's page layout.
So when we're looking at this record snapshot grounding functionality, we're actually going to pull in the executing user's page layout.
So the same prompt might be grounded on different data for an account executive versus a service agent looking at the same account record, because they have different fields on their page layouts.
That makes sense.
And then you tie in kind of the trust
layer of it all, and making sure people
are only seeing and getting access
to the data that they're granted
using existing permissions.
It's helpful. Exactly, exactly. Yeah.
All built on top of that.
Well, everything's
built on trust here at Salesforce.
So, Luke,
thank you so much for sharing that.
We're going to have plenty more questions
for you.
So, don't get too comfortable,
but we do have plenty more to get to.
In fact, we've got a whole nother section
to get to all of the amazing innovations
coming for developers this release.
So last but not least,
we have the amazing Angela Lee, Director
of Product Management here to show you
how you can customize Einstein Copilot.
Take it away, Angela. Thanks, Gillian.
Hi developers, my name is Angela Lee
and I'm a PM on the Einstein Copilot team.
Since our announcement of our GA
last month, copilot has really taken off
and I cannot wait to share with you the new features that we have coming.
First up, we have scratch org support.
We know that developers are already really familiar with scratch orgs, so we want to make sure that you can build and test your new Copilot features in them.
Developers can now build and test new configurations without disrupting their users or their active copilot.
And once you feel that your changes and customizations are perfect, they can easily be deployed with a quick push.
One of the most powerful things
about Einstein
Copilot is the ability to customize it
for your organization's specific needs.
And that's why
we wanted to give even more control
over to our developers
by having our new metadata API.
With this new API, you can go and see
the metadata behind your custom actions
and really understand what's powering them
and make changes at that metadata level.
And then once you're ready, you can
deploy it across all of your environments.
Now, this next feature might seem really minor, but it is a really big context switch for Copilot.
With page context support,
we can now have copilot
automatically understand
the context of the page that a user is on,
and it'll be able to ground their response
on that page context
without having to ask clarifying questions
or disambiguate
it before it provides a response.
And for developers,
this is really important
because this means that you're going
to get that page context
understanding
built into your custom actions.
This means that your actions
are going to become even more accurate.
And finally,
I am so excited to announce the beta
of our newest feature,
Standard and Custom Topic support.
We know that copilots are most powerful when they are specifically solving a use case or a job to be done, and now topics can help you do that.
With topic support, your users are going
to unlock even more productivity
because topics will help them
solve their specific use cases.
That was a lot of talking, and we know a demo is worth a thousand words.
This is where we want to bring in our scratch org.
You would go about spinning up your scratch org the way you normally would, but now here is the new definition file, so you can bring in Einstein Copilot features.
Just make sure that your edition is set to Einstein, and that you're also going to bring in Copilot, and then you're going to be good to go.
So now that we have our scratch org ready and up and running, let's take a look under the hood at our metadata.
So I already ran a command that brings in the metadata for all of Copilot.
As you can see here, we have metadata that covers our planner service, which is the orchestration framework for Copilot, and then you see all of these different functions.
The ones that are named EmployeeCopilot represent the standard, out-of-the-box actions that are built by Salesforce.
As I keep scrolling down, I see a couple of additional functions that are named differently, and those actually represent the custom actions that are related to this particular copilot.
So I have a question about this Tracker Update Capacity action, and I want to take a closer look at its metadata.
So I'm going to retrieve this GenAiFunction to grab the metadata for that custom action.
And as that runs, just in case it was going a little slow and we don't have a ton of time, I already ran this earlier, and here is the metadata that is behind that custom action.
All right. It actually ran.
So as I can see, this is a custom action that was built in our field service app to understand capacity, which is super important in the field service use case.
I know that it's targeting a flow, and here is the description, which is actually just 'test.'
But we know that descriptions are super important for Copilot to understand which actions to use, so I'm going to update that description.
And once I'm done with my changes, I can quickly right-click and then deploy this to our orgs, and that will be available across all of our different environments.
All right, let's switch contexts and get an understanding of how page context support works.
So I'm going to go ahead and submit this
query that says summarize this account.
And before we had page context support, what Copilot would do is actually come back to us and ask: did you mean the account Acme Partners, or the opportunity related to Acme Partners, or a lead?
And I would have to, as a user, go back and click to clarify which one I meant.
But now, with page context support, Copilot is able to reason through and understand that I'm on this account page, pass in that record ID, and then give me a summary without asking for additional clarification.
And now I have this nice summary, with links out to this page if I need them, so I can reach out to that owner, Angela Lee, in case I have any questions.
And what's really powerful for you as a developer, again, is that this page context will automatically be passed to your custom actions, so you don't have to build that in yourself, and it's going to make the experience for your users even better and the results even more accurate.
And now to our final topic: topics.
So here I am in the Copilot builder, and this copilot is really tasked with understanding restaurant reservations.
I already have a few topics related to this copilot, and as you'll notice, they each tackle a specific job that needs to be done:
you know, how to handle escalations,
how to deal with existing reservations.
And I'm going to go ahead
and show you how to create a new topic
that relates to answering
frequently asked questions.
I go ahead and start with the topic label, which I've labeled General, and describe what this topic should solve for, which in this case is answering questions about restaurant reservations: you know, how someone could use their dining points, or anything related to our company.
And then I want to define the scope.
I want to make sure that copilot only uses
our internal knowledge articles
to provide information to users.
And then finally, I can set rules
and guidelines for copilot
so it always answers questions
the way I want.
For this case,
I really want to make sure that
copilot is tailoring responses
based on specific user queries
or their account details,
if they have provided them,
and never gives troubleshooting steps.
I only want it to answer questions.
So let's go ahead and take it for a spin.
I put in a question about how to activate
dining rewards towards a reservation,
which is a pretty common question
I could expect for
restaurant reservation software.
So what copilot is doing right now
is kind of understanding
the intent of the question
and going and finding
what's the right topic
to answer this question.
And voila, I see that it said, okay,
this question is related to a pretty
frequently asked question.
Here are the instructions
that I already set,
and here's the action that is assigned
that can answer this question directly,
which in this case is answering questions
with knowledge.
And the LLM went through, executed that,
and gave me this really robust response
about how I can locate my dining points,
activate them, and apply them to
my next restaurant reservation.
So that's how you can use topics
to guide your copilot
to answer questions
and build actions towards any use case.
So that was my last topic for the day.
Gillian, what was your favorite
copilot topic that we covered today?
Well, I mean, I think it's amazing
to see how easy it is to guide
your copilot and really customize
that experience for your users.
I mean, anyone could do it.
I wish it would do things like get
my toddler to eat whatever they're served.
If you figure that out, let me know;
I have that same problem.
More mac and cheese, please.
Well, we have a lot of great questions
coming in.
If you're tuning in, please ask your
questions in the chat on Salesforce Plus,
because now we've got our full lineup
of all of our developer
advocates and product managers here
to answer your questions.
So, Angela, we're going to come to you,
because there are some questions,
topical questions, coming our way.
The first one is about topics:
how do you think about using topics
versus actions?
So the way I think about topics is to
really focus on the job to be done.
Think about that one specific task
that you want to accomplish;
what we showed was answering
frequently asked questions.
And actions are
what the copilot action
is actually going to do
in order to fulfill that request.
The one thing to keep in
mind is that actions can be assigned
to multiple different topics.
So we know that across different topics,
you might need to look up a record ID
or identify related lists
or things like that.
So think of it as actions can cross
multiple different topics.
But topics should be discrete enough
that they're solving a specific use case.
Thank you for clarifying.
It helps to understand, okay,
which use case calls for which thing.
Very helpful.
Okay, another question for you.
Let's talk again about appropriate
use cases for the appropriate technology.
How do you think about which tech to use
for custom actions:
Prompt Builder versus Flow versus Apex?
Yeah,
so I mean, it's great
that we have Apex, Data Cloud,
and Prompt Builder experts here.
I really think it depends on
what your specific use case is
and also where your data lives.
So Apex is great
if you need to go external,
to pull data
from different sources.
And then prompt templates, or Prompt
Builder, are great
because you can really define
how you want the prompt to be answered,
and you can design around
your specific use cases.
So, you know, really think about
where everything in your system lives
and what makes the most sense.
And the last thing I would say:
the great thing about copilot is that
developers already have a lot of things
in flows, in prompt templates, or in Apex.
And so you can repackage
a lot of those things
and build custom actions with them
without a lot of rework.
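As a rough illustration of the "Apex when you need to go external" point, here is a hedged sketch of an invocable Apex action that calls out to an external system. The class names, endpoint, and named credential are made up for the example, not an integration from the show.

```apex
// Hypothetical sketch: an Apex custom action that pulls data from an
// external system, the kind of job where Apex fits better than Flow or
// a prompt template alone.
public with sharing class ExternalInventoryAction {

    public class Request {
        @InvocableVariable(label='Product Code' required=true)
        public String productCode;
    }

    public class Response {
        @InvocableVariable(label='Availability')
        public String availability;
    }

    @InvocableMethod(label='Check External Inventory'
                     description='Looks up stock levels in an external system.')
    public static List<Response> check(List<Request> requests) {
        List<Response> results = new List<Response>();
        for (Request req : requests) {
            HttpRequest callout = new HttpRequest();
            // 'Inventory_API' is an illustrative named credential, not a real one.
            callout.setEndpoint('callout:Inventory_API/stock/' + req.productCode);
            callout.setMethod('GET');
            HttpResponse resp = new Http().send(callout);

            Response out = new Response();
            out.availability = (resp.getStatusCode() == 200)
                ? resp.getBody()
                : 'Inventory service returned status ' + resp.getStatusCode();
            results.add(out);
        }
        return results;
    }
}
```

Because it is just an invocable method, a class like this can be surfaced to copilot as a custom action or reused from Flow, which is the "repackage what you already have" point above.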
That's kind of the big overall value
prop of the platform, right?
You've got all of these different tools
to help make your users more efficient,
and there are multiple ways
to build with them,
so you can really be creative
in figuring out
which one is the right
one to use for which task.
Coming back to you, Angela,
we've got more questions for you.
Let's talk again, kind of on topic,
the same idea of when to use which thing.
let's talk about the difference
between bots and copilot.
Oh, this is something I know I struggled
with when I started as a PM on the copilot
team: what's the difference
between copilot and bots?
One way I think about it is that with
traditional chat bots, behind the scenes
you're spending a lot of time
as a developer coding
each of the ways
that a conversation could go.
Think about the last time
you talked to your cable company,
right?
Maybe you started
with a billing question,
and then they tried to get you to upgrade.
Someone had to go in and code
every single way
that conversation could go.
With copilot and generative AI,
it's a lot more natural.
You have natural language
processing, and it can reason through
all of those different requests
without additional coding.
So you're setting up
the overall strategy of the system versus
figuring out all the different pathways
and all the different possibilities
that you'd need to code to prepare for.
Very helpful.
I just learned something too.
Okay, let's get to some of our questions
from Salesforce Plus,
now that we've got everybody on.
I'd say it's more than a warm couch,
it goes straight to hot, since now
everybody's in the hot seat.
So this is actually coming to you.
This is from Nelson on Salesforce Plus:
with the new SOQL enhancement
for Apex, are the current governor
limits, 100,000 characters and
200 queries per batch, remaining unchanged?
Yeah, those are going to remain unchanged
for sure.
That's why you need to be careful
about what you are using there.
Great. Good to know. All right.
I'm actually coming back to you
about, again, kind of a proper use case
for the proper technology.
Can you give a little example of
a queueable use case versus a batch
Apex use case?
We talked a little bit about this,
but can you just kind of clarify?
So Batch Apex, as I said, you know,
let's use an analogy.
I love using analogies.
Think of Batch Apex as a slow cooker,
and the queueable classes
that we have as a microwave.
A microwave can get things done
very quickly.
What queueables did not have,
until now,
was a way to divide
your large chunk of work into smaller ones;
there was no way to do that other
than batch classes.
So if you have a long process
which you are comfortable
having execute in its own time,
use Batch Apex.
That's the use case for it.
But if you have something
which you think
the platform can process
quickly, queueables are the way to go.
And now with cursors,
we are giving you the power
that Batch Apex had,
the ability to batch things up.
We are making queueables
equivalent to batches.
So hopefully that clarified it.
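For anyone who wants to see the microwave side of that analogy in code, here is a minimal, hypothetical sketch of a Queueable that uses an Apex cursor to work through a large result set in chunks, chaining itself until the cursor is exhausted. The cursor calls follow the beta API as described in the release notes, so verify the exact names before relying on them; the class, query, and field choices are illustrative.

```apex
// Hypothetical sketch: a Queueable that batches its own work with an
// Apex cursor (beta), chaining until the cursor is exhausted.
public with sharing class ContactCleanupQueueable implements Queueable {

    private Database.Cursor cursor;
    private Integer position;
    private static final Integer CHUNK_SIZE = 200;

    public ContactCleanupQueueable() {
        this(Database.getCursor(
            'SELECT Id, Description FROM Contact WHERE Email = null'), 0);
    }

    private ContactCleanupQueueable(Database.Cursor cursor, Integer position) {
        this.cursor = cursor;
        this.position = position;
    }

    public void execute(QueueableContext context) {
        // Fetch only this chunk -- the "microwave" portion of the work.
        List<Contact> chunk = cursor.fetch(position, CHUNK_SIZE);
        for (Contact c : chunk) {
            c.Description = 'Needs email follow-up';
        }
        update chunk;

        // Chain the next chunk if there is more work left, the way Batch
        // Apex would, but on the queueable's faster cadence.
        if (position + CHUNK_SIZE < cursor.getNumRecords()) {
            System.enqueueJob(
                new ContactCleanupQueueable(cursor, position + CHUNK_SIZE));
        }
    }
}
```

Kicking it off is one line, System.enqueueJob(new ContactCleanupQueueable()); and, as noted above, the usual governor limits still apply to each execution.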
Really, the slow cooker versus
the microwave.
That analogy is very powerful.
I love that you can just zap it
instead of waiting all day.
Okay, Luke, I'm going to come to you
now. This is a question from
Vamsi on Salesforce Plus:
how should someone
try to play around with Einstein Copilot,
Prompt Builder, and the Gen AI features?
I would point you to Trailhead.
We have a bunch of new modules
there for Prompt Builder, for Copilot,
for the entirety of our gen AI platform.
You can get a short-lived
demo org to experiment
with all these new functionalities
and use cases, build stuff yourself,
see what works, see what doesn't,
and then go iterate on it from there.
And I would like to add on to that.
We have been running programs
on all of these,
especially on the developer advocacy team.
We have the AI Now tours,
where we go to different cities and
organize these workshops where we give
free hands-on experience on all of this.
You get an org for,
I think, five days, Danielle?
Yeah, five days,
where you can play with these tools.
It's a guided workshop.
And then there are a lot of virtual
workshops that we are doing, where
we get on a live stream
and we teach people
how to work with these new technologies.
A lot of blog articles come out on
our channel, and on YouTube,
of course, we post a lot of videos.
So there are a lot of ways to learn.
And, you know, I would like to say that
orgs are no longer the constraint.
One constraint that I used to hear
from our developers was,
hey, how do I get an org with Einstein
Copilot and Prompt Builder?
With Trailhead
you get it right away, a five-day org.
And when you attend these workshops,
we give you enough orgs
to go through
all of the hands-on experience.
I believe we have a farm for orgs.
I know,
I think that's a bit of inside baseball,
but I thought it was fun to mention.
Okay.
But yeah, to your point, it's a great,
easy way to get hands-on.
If you want to get access to these
amazing workshops and virtual events,
go to developer.salesforce.com,
one of my favorite websites;
you can get everything there.
So let's take a moment
and talk more big picture.
You all showed a lot of innovation
today,
some amazing innovations,
especially with Data Cloud.
It got me all excited.
But in terms of just overall
where this is going with Data Cloud,
with AI, and, you know, the future
of developing on Salesforce,
I'd love to get some ideas from you about
what you are most looking forward to,
kind of just big picture, down the line.
Where do you see a lot of this going?
What excites you
the most about doing this work?
Angela, I'll start with you. Yeah.
One of the things
we hear a lot from customers and users,
and it's also what I'm most excited
about as well, because it's what I work
on most of the time, is making copilot
more personalized and more proactive.
And that is one of the things
that we hear a lot from our customers:
what we really need is
help finishing the work,
as if I had a true assistant
that I can delegate tasks to
and that automatically
does all those things.
And I think where it's relevant
for developers is that, as we said,
you already have all the tools
that automate processes for you,
whether they live in flows or Apex
or anywhere else.
So we want to start providing a way for
developers
to utilize those things
and make copilot more proactive.
Again, making it easier to be a developer,
right? Yeah.
Luke, how about you?
Yeah, you've heard this, like,
a thousand times:
generative AI is only as good as the data
that you give it, right?
And I'll touch on it again:
the idea of working with unstructured data
as a grounding resource
in your prompt templates and in copilot
is going to unlock
so many use cases across your org.
And you know all about
the value of unstructured data, right?
Can you actually just explain it for
those of us watching
who aren't experts?
I mean, of course
I know everything, but
let's talk about it
at a high level:
what is the difference between
structured data and unstructured data?
So, another analogy: structured
data fits in a spreadsheet.
Structured data has fields, right,
there's a key-value relationship.
With unstructured data, you don't
necessarily know the format;
there's no object model
associated with the data coming in.
And so you can think of it as what
often manifests
as emails or Slack messages or meeting
transcripts at your organization.
There's so much unstructured data, and it
is so rich in the use cases it enables.
You also get bonus
points for mentioning Slack,
so thank you, I appreciate that.
I love it, like we all do.
What are you most looking forward to in
the future of developing on the platform?
One of the things that I really,
really love
is how easy it's getting for developers
and how much less time
it's taking to do things
that used to take a really,
really long time.
Like what I showed today: being able
to hook up to Heroku and get the data in
in as little as five minutes.
You can do it
with so many external systems
now with Data Cloud; you can do it with
Snowflake, BigQuery, I mean,
we are releasing connectors so fast,
and it's just really growing,
how quickly you can get data into Data
Cloud from so many different sources.
So I'm excited for all the new connectors,
especially the ones that everyone
has been asking about,
like Redshift and everything.
And I'm also really, really excited
about sandboxes, of course.
So everyone out there, yes,
I am very excited about sandboxes.
And also all the new AI functionality
that's coming out for Data Cloud:
semantic search,
the vector database, RAG,
all of which can work with the
structured and unstructured data
that we were just talking about.
So I'm just really, really excited
about how the platform has grown
and also
just how fast it's all been happening.
I mean, a year and a half ago,
we wouldn't have known
half of the terms
that we just talked about.
Half the sentences
that just came out of your mouth,
who would have understood them back then?
So it is
amazing, the pace of innovation
and just all of the things
it's unlocking for folks.
All right, Mo, your turn. Yeah.
So I would definitely complement
what Danielle said.
This is one of the questions
that I always think about:
okay, how is it changing things
for developers?
What is the future for us
as developers who are, you know,
hands-on, always on the keyboard,
now that I see this generative
AI generating all this code
automagically, or automatically,
whatever you want to call it?
But, you know,
when I think about it, the
reality is, as a developer, I'm
really excited about all these changes.
The reason
is, you know, the productivity gain.
It's making me so much
more productive.
I mean, if I look back at
how I used to work a year ago,
I think I have now transformed
my developer workflow.
And I can tell you,
transparently, I used all these tools
for building some of these demos as well.
I use Einstein for Developers
to get all these code snippets.
It's just making me more productive,
you know, and that doesn't mean
that I have less work to do.
I know, you're still working very hard
doing all this stuff.
Right, but I'm more productive,
and I'm
happy to delegate, you know, work that
I think it can do for me.
I'm happy to delegate it,
because it used to be boring.
Like,
you know, even typing code can be,
I mean, it is,
you know, a laborious task.
You know, sometimes I'm like,
okay, I know this thing,
I know how this is supposed to work,
but I still have to get onto my keyboard
and keep typing it, you know?
So that's the change that I have: like,
okay, I know the thing,
I can just ask for it,
and it will give me the code now.
It might not be 100% accurate.
You know, maybe it's 50% accurate.
But then at that point, you know,
I can use my own intelligence, and that's
where I think, as humans,
as developers, we are still valuable.
You know, in my opinion, personally,
I don't think there is any question
about the value, you know, that
we bring to the table as developers.
So I just have to keep refining
what it gives me.
And sometimes it's like,
okay, you can't do this better than me.
Let me just take over and do the stuff.
Well, it's like,
you know, you are an artist, right?
Instead of having to make all the
individual paint colors,
you have somebody providing them for you,
so you can just go ahead
and create your amazing creation.
Absolutely. That's wonderful.
Well,
thank you all so much for being here.
There were a ton of questions in the chat.
If we did not get to your question, please
make sure you ask it in the Trailblazer
Community; we've got experts standing by
there to make sure you get it answered.
And really, thank you all for being here
and sharing all this innovation.
You got really deep into some amazing new
things that developers can get hands-on
with and that really kind of change
the game when it comes
to developing on the platform.
There's one more thing I would like to
add: Dreamforce is coming,
and we have a call for speakers.
Yes, we are definitely looking for
speakers, and the call for speakers
is open.
So please, you know, if you are interested,
or if you are coming to Dreamforce,
what could be better than
speaking on stage?
We have all that information
on our developer blog,
and we are definitely looking for speaker
submissions.
I'm really excited to see those.
I believe that closes soon, right?
June 6th? June 6th, yeah.
So get those ideas together.
And again, developer.salesforce.com
is a great place to go.
You'll get all the information
you need to submit for Dreamforce,
and even if you don't want to
submit, come join us.
It's an amazing chance
to get hands on with all the technology
and really get that learning experience
from our amazing experts.
So I'll see you all at Dreamforce.
Thank you all for being
here and thank you all for tuning in.
Again, we've got some amazing resources
for you as developers.
Make sure you check out the Salesforce
Developer's Guide to the Summer '24 release
by scanning the QR code
you see right here on the screen,
and you're going to find more highlights
on the latest features
and functionality
to help you succeed as a developer.
Now, even though we're wrapping up this
show, you can still ask your questions,
like I said, in the Trailblazer Community,
and we'll make sure
you get an expert answer.
There's still more Release Readiness
Live to come: up next at 11 a.m.
Pacific,
we have CRM Analytics, and then tomorrow
we've got two more shows
with the Service and Admin Previews.
Thank you so much for tuning in, and we'll
catch you next time in the cloud.