Last week, you might have seen an advertisement go by on the blog looking for an Amsterdam-based freelance writer and editor for Project Gooseberry. Well, I’m thrilled to say that this position has been filled… by me! American-born, Dutch-adopted copywriter and journalist Elysia Brenner.
That means that for the next one to ten (or more) months, I’ll be the new voice on the Gooseberry blog and beyond, giving you a more in-depth look at the action taking place in the Blender Institute’s Amsterdam studio. From the day-to-day work that’s going into the Gooseberry pilot to the full development process behind the script, the look, the characters, and more, it’s my job to share with you everything you want to know.
Which raises the question… what do you want to know? Please tell me in the comments what kind of news you’d like to see more of, and I will do everything I can to make that happen.
I’m also attending the Blender Conference, taking place at Amsterdam’s De Balie today and this weekend. (I’ve joined the team at the perfect moment for total immersion training.) I’m new to the Blender community, so if you’re here at the conference as well and you see me, please do come up and say hi. I look something like this:
Looking forward to getting to know the already very welcoming Blender family, and to sharing with you guys one of my personal fangirl passions: open-source 3D animation!
I look forward to reading your scoops!
I don’t know if this is in the realm of suggestions you were looking for, but: I would like to see how the movie is inspiring and pushing new tech into Blender, especially features the movie wouldn’t be feasible without. Stuff on cloud and version control tech would also be cool.
Thanks, Reavenk, this is exactly the sort of thing I had in mind. ;-) On it!
I think it might take some of their time but I’m also sure a lot of animators and people working on the project would like to tell stories of their work.
For example: a certain scene that was really hard to do, making a shader that originally seemed impossible, “cheats” that no one will know about because the camera doesn’t see it, easter eggs, etc.
Definitely report on “normal” things (progress updates, milestones, what’s happening, new tech developed for the project), but an occasional oddball piece would add some spice. ;)
Excellent idea. Pleased to meet you, Elysia!
For my part, I am most interested in visual development. Many concepts and previews have already been published, but I’d love to hear more details about the method(s) used to work on the look of the film: how goals are defined and what solutions are implemented to meet them, whether technical or artistic.
Due to the start of my bachelor thesis (which is partly about Blender development), I’d love to see something about Blender development in general. Maybe you could report on how the professionals at the Blender Institute develop a new feature from start to finish?
I will also be doing that on my own blog for my feature (point cloud support in Blender), so this would be a great help: learning from the pros.
Thank you, and have fun working on the Gooseberry project :)
Hi Elysia, and welcome to the team!
As an illustrator and painter, I am interested in the visual development: concepts, and seeing artists working at their desks, especially if they draw or paint traditionally with watercolors or pencil sketches. That is interesting because I can follow how the artist “thinks”, and I feel like I am there. I would also like an in-depth interview on what inspires them and what their process is.
And, as a bit of gossip, I would like an immersive experience: a GoPro (or similar) tour of the studio from wake-up until everyone goes to sleep, showing how people live there, Big Brother style (respecting their privacy, for sure), and giving us an idea of how they organize their schedules.
Thanks for your work. And courage to the team!
It would be great if from time to time you interviewed each member, one at a time, about their professional background (how they started, where/how they learned 3D, previous work, etc.) and gave some sort of overview of “global and local” workflows: what their work is, how they do it, and how it relates to other people’s jobs, e.g. a rigger talking about his workflow, like dividing a rig into simpler parts, and how he connects with animators, modelers, and developers to do a better job.
I know editing video adds quite a lot of work, but the interviews could be short, say no longer than 15 minutes, or you could publish audio or transcriptions, like BlenderDiplom’s interviews or BlenderGuru’s podcast. Just don’t do a one- or two-hour interview like Andrew does.
This is all fantastic stuff, and definitely along the lines of what we had planned, so it’s great to hear we’re on the same page.
Thanks to everyone who’s chimed in so far, and keep the great ideas coming!
The weekly presentations are really painful to watch, as are all the videos created so far; if you could help here, that would be great.
I like the weekly presentations a lot and hope you don’t stop making them. They really help people put a personality to a name.
I like seeing them all getting along, being happy, and creating something great. I like laughing along with them and seeing the development, especially the character creation parts.
The only thing I would change is the jitter in the camera; I’d also adjust the audio somehow to make it clearer, since it’s sometimes hard to hear what they are saying.
So when will the next weekly presentation be coming out?
You can watch the weekly live today at 18:00 CET on the Gooseberry YouTube channel (https://www.youtube.com/user/ProjectGooseberry)…or wait to read the highlights on this blog on Monday! :-)
I loved the Sintel videos with the artists/developers geeking out over new features, hardware, and funny bugs. I loved the rough and uncut feel.
Don’t polish your stories too much; quantity over quality, please!
I’d like to read about the who’s who and who’s doing what,
but also about their stories, the key moments they wish to share, etc.
Do you have an organizational breakdown structure for the project?
Hi Fab! We actually just added a new page on Friday all about this: /the-team/
Hi, Elysia! [Did your parents really give you that name?]
What I am most hoping for, for Blender via Gooseberry, is a substantial improvement to the Hair system. Currently, if one specifies an influence of gravity, with the expectation that hairs growing against gravity will flatten a bit, one is faced with the fact that hairs that grow in the direction of gravity get elongated (which, of course, is not good). Having a tweakable and believable (length-constant) gravity effect on hair and long fur would be much appreciated for animation, so the artist doesn’t have to comb at every keyframe.

The Particle “Group” option, which chooses a group member at random, is most welcome. It allows you, for example, to model a set of curly hairs as curves, complete with shapekeys, drivers, and all that, and group them. But particles emitted from faces need to have a scaling option, in the same way as DupliFaces!

The ability to treat Curve Splines as Strands would be great: in particular, the availability of “intercept” (the u coordinate) to Cycles; otherwise one is forced to explicitly convert perfectly good curves (beveled and tapered) into meshes. Brute force!
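For anyone curious what a length-constant gravity effect could look like under the hood, here is a minimal, purely illustrative Python sketch in the spirit of position-based dynamics; this is my own toy example, not Blender’s actual hair solver. It sags a strand under gravity, then projects each segment back to its rest length so the hair never stretches:

```python
import math

def apply_gravity_length_preserving(points, gravity_step, iterations=10):
    """Illustrative sketch (position-based-dynamics style, NOT Blender's
    actual hair solver): sag a strand under gravity, then project each
    segment back to its rest length so the hair never stretches.
    points[0] is the root and stays fixed."""
    rest_lengths = [math.dist(points[i], points[i + 1])
                    for i in range(len(points) - 1)]
    # Gravity: pull every non-root vertex straight down.
    pts = [points[0]] + [(x, y, z - gravity_step) for x, y, z in points[1:]]
    # Constraint projection: restore each segment's rest length,
    # moving only the outer vertex so the root stays put.
    for _ in range(iterations):
        for i in range(len(pts) - 1):
            ax, ay, az = pts[i]
            bx, by, bz = pts[i + 1]
            d = math.dist(pts[i], pts[i + 1])
            if d > 1e-12:
                s = rest_lengths[i] / d
                pts[i + 1] = (ax + (bx - ax) * s,
                              ay + (by - ay) * s,
                              az + (bz - az) * s)
    return pts

# A horizontal strand of total length 2.0; it sags but does not elongate.
hair = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
sagged = apply_gravity_length_preserving(hair, gravity_step=0.5)
```

The key point is the projection step: gravity is free to bend the strand, but segment lengths are always restored afterwards, which is exactly the behavior the elongating hairs lack.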
Haha, yes, that is my real name. ;-) Thanks for your feedback; I’ll pass this comment on to Lukas!
Hats off, Elysia!
About gravity influence: There are two primary forces acting against gravity:
– Bending stiffness, which preserves the local shape of each hair
– Goal stiffness, which pulls hairs back to a global shape, relative to the surface
The goal stiffness effect is not physically correct, but it is easy to use and more stable for small deformations.
Both of these forces are based on a “rest shape” that defines the force-free basis of the hair geometry. Currently this rest shape matches the grooming (hair edit) exactly. However, one usually models a natural hairstyle under the influence of gravity – most characters are not floating in space when you model them! This means that one either has to take gravity into account when grooming (tedious and clumsy), or the hair sim has to provide a way to calculate a force-free rest shape based on the initial grooming. The latter approach does not exist in Blender yet, but it could avoid these stretching issues. The method is not quite simple, though, so I can’t make a roadmap promise. (For those interested in the math: https://hal.inria.fr/hal-00857559/file/inverseHairModeling.pdf)
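To make the goal stiffness term concrete, here is a minimal illustrative sketch in plain Python; the function name and numbers are made up for the example, and this is a simplified stand-in, not Blender’s solver code:

```python
def goal_stiffness_forces(points, rest_points, k_goal):
    """Per-vertex spring force pulling each hair vertex back toward its
    rest ("goal") position, as described above. Bending stiffness would
    act similarly, but on the angles between neighboring segments
    rather than on absolute positions."""
    return [tuple(k_goal * (r - p) for r, p in zip(rest_v, cur_v))
            for rest_v, cur_v in zip(rest_points, points)]

# A hair groomed straight up, currently sagging sideways under gravity.
rest = [(0.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, 2.0)]
curr = [(0.0, 0.0, 0.0), (0.0, 0.5, 1.0), (0.0, 1.0, 1.8)]
forces = goal_stiffness_forces(curr, rest, k_goal=2.0)
```

Because the force is proportional to the offset from the rest shape, a groom made under gravity bakes that sag into the “force-free” state, which is exactly why the inverse-modeling approach above is needed.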
Thank you, Lukas! Downloading the pdf even as I type.
What I’m trying now: curve (beveled, radius-tapered) => mesh (edited some more); two copies, one doing softbody, the other shapekeys; transfer the softbody results (at various angles of inclination) to shapekeys, then use PyDrivers for the shapekey values, depending on the angle of inclination.
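As a purely hypothetical sketch of that kind of PyDriver helper (the function name, angle convention, and easing curve are my own illustration, not the setup described above):

```python
import math

def sag_shapekey_value(inclination_deg, max_angle_deg=90.0):
    """Hypothetical PyDriver helper: map a strand's inclination angle
    (0 = growing straight up, 90 = horizontal) to a 0..1 shapekey
    value, so a softbody-derived "sagged" shape blends in smoothly as
    the strand tips over. The sine easing is an arbitrary choice."""
    t = max(0.0, min(inclination_deg / max_angle_deg, 1.0))
    return math.sin(t * math.pi / 2.0)
```

In Blender, a helper like this would typically be registered in `bpy.app.driver_namespace` so a shapekey’s driver expression can call it.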
I’m faced with the fact that a particle (object, choose one from a group) on, say, a shin, emitted from faces, does not SCALE according to face area. Could you guys please add that scaling option, pretty please?
P.S. Wouldn’t it be great to have the option of associating shapekey values with the object block instead of, necessarily, the data block?
Then linked duplicates of complex objects with lots of shapekeys could show different shapes (expressions, etc.) without having to duplicate all the coordinates and deltas! Right?